Jan 30 21:15:47 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 30 21:15:47 crc restorecon[4678]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 30 21:15:47 crc restorecon[4678]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 21:15:47 crc restorecon[4678]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc 
restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc 
restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 
21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:47 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 
crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 
21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc 
restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc 
restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 30 21:15:48 crc restorecon[4678]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 
crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc 
restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc 
restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 30 21:15:48 crc restorecon[4678]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 30 21:15:49 crc kubenswrapper[4834]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 21:15:49 crc kubenswrapper[4834]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 30 21:15:49 crc kubenswrapper[4834]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 21:15:49 crc kubenswrapper[4834]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 30 21:15:49 crc kubenswrapper[4834]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 30 21:15:49 crc kubenswrapper[4834]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.289789 4834 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295027 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295058 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295069 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295078 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295087 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295098 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295111 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295121 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295131 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295140 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295148 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295155 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295177 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295185 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295193 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295201 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295208 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295216 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295225 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295233 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295241 4834 feature_gate.go:330] 
unrecognized feature gate: InsightsConfig Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295250 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295258 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295266 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295273 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295281 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295289 4834 feature_gate.go:330] unrecognized feature gate: Example Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295297 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295304 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295315 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295324 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295333 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295342 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295351 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295360 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295368 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295375 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295383 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295391 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295433 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295443 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295455 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295464 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295472 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295480 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295487 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295495 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295502 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295510 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295518 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295526 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295536 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295546 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295554 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295564 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295572 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295580 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295588 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295595 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295603 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295610 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295618 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295625 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295632 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295640 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295651 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295660 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295673 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295681 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295689 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.295697 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296560 4834 flags.go:64] FLAG: --address="0.0.0.0"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296586 4834 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296604 4834 flags.go:64] FLAG: --anonymous-auth="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296615 4834 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296626 4834 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296635 4834 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296646 4834 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296657 4834 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296667 4834 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296676 4834 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296686 4834 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296695 4834 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296704 4834 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296713 4834 flags.go:64] FLAG: --cgroup-root=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296721 4834 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296730 4834 flags.go:64] FLAG: --client-ca-file=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296739 4834 flags.go:64] FLAG: --cloud-config=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296748 4834 flags.go:64] FLAG: --cloud-provider=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296756 4834 flags.go:64] FLAG: --cluster-dns="[]"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296766 4834 flags.go:64] FLAG: --cluster-domain=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296775 4834 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296786 4834 flags.go:64] FLAG: --config-dir=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296795 4834 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296805 4834 flags.go:64] FLAG: --container-log-max-files="5"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296817 4834 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296826 4834 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296835 4834 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296844 4834 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296852 4834 flags.go:64] FLAG: --contention-profiling="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296861 4834 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296870 4834 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296879 4834 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296891 4834 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296901 4834 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296911 4834 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296919 4834 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296928 4834 flags.go:64] FLAG: --enable-load-reader="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296937 4834 flags.go:64] FLAG: --enable-server="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296946 4834 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296957 4834 flags.go:64] FLAG: --event-burst="100"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296967 4834 flags.go:64] FLAG: --event-qps="50"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296976 4834 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296985 4834 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.296993 4834 flags.go:64] FLAG: --eviction-hard=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297004 4834 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297013 4834 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297022 4834 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297032 4834 flags.go:64] FLAG: --eviction-soft=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297041 4834 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297049 4834 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297058 4834 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297067 4834 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297077 4834 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297087 4834 flags.go:64] FLAG: --fail-swap-on="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297096 4834 flags.go:64] FLAG: --feature-gates=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297107 4834 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297116 4834 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297126 4834 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297136 4834 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297145 4834 flags.go:64] FLAG: --healthz-port="10248"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297154 4834 flags.go:64] FLAG: --help="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297162 4834 flags.go:64] FLAG: --hostname-override=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297172 4834 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297182 4834 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297191 4834 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297200 4834 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297208 4834 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297217 4834 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297227 4834 flags.go:64] FLAG: --image-service-endpoint=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297236 4834 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297244 4834 flags.go:64] FLAG: --kube-api-burst="100"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297253 4834 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297263 4834 flags.go:64] FLAG: --kube-api-qps="50"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297272 4834 flags.go:64] FLAG: --kube-reserved=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297281 4834 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297291 4834 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297300 4834 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297309 4834 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297318 4834 flags.go:64] FLAG: --lock-file=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297327 4834 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297336 4834 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297346 4834 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297358 4834 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297368 4834 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297376 4834 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297385 4834 flags.go:64] FLAG: --logging-format="text"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297427 4834 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297441 4834 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297453 4834 flags.go:64] FLAG: --manifest-url=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297462 4834 flags.go:64] FLAG: --manifest-url-header=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297474 4834 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297484 4834 flags.go:64] FLAG: --max-open-files="1000000"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297495 4834 flags.go:64] FLAG: --max-pods="110"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297504 4834 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297513 4834 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297523 4834 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297532 4834 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297541 4834 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297550 4834 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297559 4834 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297578 4834 flags.go:64] FLAG: --node-status-max-images="50"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297587 4834 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297596 4834 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297605 4834 flags.go:64] FLAG: --pod-cidr=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297615 4834 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297628 4834 flags.go:64] FLAG: --pod-manifest-path=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297637 4834 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297646 4834 flags.go:64] FLAG: --pods-per-core="0"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297656 4834 flags.go:64] FLAG: --port="10250"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297665 4834 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297674 4834 flags.go:64] FLAG: --provider-id=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297683 4834 flags.go:64] FLAG: --qos-reserved=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297691 4834 flags.go:64] FLAG: --read-only-port="10255"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297701 4834 flags.go:64] FLAG: --register-node="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297710 4834 flags.go:64] FLAG: --register-schedulable="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297719 4834 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297733 4834 flags.go:64] FLAG: --registry-burst="10"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297742 4834 flags.go:64] FLAG: --registry-qps="5"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297752 4834 flags.go:64] FLAG: --reserved-cpus=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297761 4834 flags.go:64] FLAG: --reserved-memory=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297772 4834 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297781 4834 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297790 4834 flags.go:64] FLAG: --rotate-certificates="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297799 4834 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297808 4834 flags.go:64] FLAG: --runonce="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297817 4834 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297826 4834 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297835 4834 flags.go:64] FLAG: --seccomp-default="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297844 4834 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297853 4834 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297862 4834 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297871 4834 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297881 4834 flags.go:64] FLAG: --storage-driver-password="root"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297890 4834 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297899 4834 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297907 4834 flags.go:64] FLAG: --storage-driver-user="root"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297917 4834 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297926 4834 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297935 4834 flags.go:64] FLAG: --system-cgroups=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297944 4834 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297958 4834 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297967 4834 flags.go:64] FLAG: --tls-cert-file=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297976 4834 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297986 4834 flags.go:64] FLAG: --tls-min-version=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.297996 4834 flags.go:64] FLAG: --tls-private-key-file=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298005 4834 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298013 4834 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298022 4834 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298032 4834 flags.go:64] FLAG: --v="2"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298043 4834 flags.go:64] FLAG: --version="false"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298056 4834 flags.go:64] FLAG: --vmodule=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298067 4834 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298076 4834 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298278 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298288 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298297 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298305 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298313 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298320 4834 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298328 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298336 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298344 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298351 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298359 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298366 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298375 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298385 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298424 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298435 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298445 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298454 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298462 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298471 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298478 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298486 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298494 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298504 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298512 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298520 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298528 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298536 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298544 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298552 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298560 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298567 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298575 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298585 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298595 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298603 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298612 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298621 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298629 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298637 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298646 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298653 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298661 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298669 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298677 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298684 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298692 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298699 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298707 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298715 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298723 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298736 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298744 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298754 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298763 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298773 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298782 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298790 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298797 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298806 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298815 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298825 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298836 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298844 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298852 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298860 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298868 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298875 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298883 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298890 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.298898 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.298921 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.309459 4834 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.309512 4834 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309624 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309636 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309642 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309648 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309654 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309659 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309664 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309669 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309675 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309680 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309685 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309690 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309694 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309699 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309704 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309709 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309714 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309719 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309724 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309730 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309734 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309740 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309746 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309753 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309759 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309764 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309769 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309774 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309779 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309784 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309789 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309794 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309799 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309803 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309810 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309814 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309819 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309824 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309829 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309833 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309838 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309844 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309848 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309853 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309858 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309863 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309867 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309872 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309877 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309882 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309887 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309892 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309897 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309904 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309911 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309917 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309922 4834 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309928 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309932 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309939 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309945 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309951 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309958 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309966 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309973 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309982 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309990 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.309996 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310001 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310006 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310013 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.310022 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310271 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310290 4834 feature_gate.go:330] unrecognized feature gate: Example
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310297 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310304 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310311 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310317 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310323 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310329 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310336 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310344 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310351 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310355 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310360 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310366 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310373 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310380 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310385 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310416 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310422 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310428 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310434 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310439 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310445 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310450 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310455 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310460 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310465 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310470 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310474 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310479 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310484 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310489 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310494 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310498 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310504 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310509 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310514 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310518 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310523 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310528 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310533 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310538 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310543 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310548 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310553 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310559 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310564 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310569 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310573 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310578 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310583 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310588 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310595 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310601 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310606 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310613 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310618 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310623 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310628 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310633 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310638 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310643 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310647 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310654 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310660 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310665 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310670 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310675 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310679 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310684 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.310690 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.310697 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.311631 4834 server.go:940] "Client rotation is on, will bootstrap in background"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.316973 4834 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.317111 4834 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.318824 4834 server.go:997] "Starting client certificate rotation"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.318861 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.319893 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 05:01:34.54731215 +0000 UTC
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.320004 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.342198 4834 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.347307 4834 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.350035 4834 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.363477 4834 log.go:25] "Validated CRI v1 runtime API"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.400479 4834 log.go:25] "Validated CRI v1 image API"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.402426 4834 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.407629 4834 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-30-21-09-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.407673 4834 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.434140 4834 manager.go:217] Machine: {Timestamp:2026-01-30 21:15:49.430332151 +0000 UTC m=+0.583478359 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a8c42df5-e7c6-43f3-b21d-2acb5110253c BootID:b49f675e-147a-40a2-ab31-7b9d1f2d710c Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:7e:c1:b1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:7e:c1:b1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a8:14:30 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:95:fc:79 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c2:97:7e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:dc:a6:c1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7e:2e:c5:5d:5c:87 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:4b:80:34:7b:87 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.434537 4834 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.434710 4834 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.436997 4834 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.437279 4834 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.437334 4834 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.437819 4834 topology_manager.go:138] "Creating topology manager with none policy"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.437839 4834 container_manager_linux.go:303] "Creating device plugin manager"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.438512 4834 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.438563 4834 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.438788 4834 state_mem.go:36] "Initialized new in-memory state store"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.438944 4834 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.443056 4834 kubelet.go:418] "Attempting to sync node with API server"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.443100 4834 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.443149 4834 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.443169 4834 kubelet.go:324] "Adding apiserver pod source"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.443187 4834 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.449132 4834 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.450272 4834 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.450872 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.450891 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.451047 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.451044 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.451879 4834 kubelet.go:854] "Not starting ClusterTrustBundle informer because 
we are in static kubelet mode" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453736 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453782 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453798 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453811 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453833 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453845 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453858 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453879 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453895 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453911 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453928 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.453941 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.456646 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.460869 4834 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.460983 4834 server.go:1280] "Started kubelet" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.462385 4834 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.462361 4834 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 30 21:15:49 crc systemd[1]: Started Kubernetes Kubelet. Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.463687 4834 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.464009 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.464054 4834 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.464338 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 11:44:30.605164009 +0000 UTC Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.464535 4834 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.464564 4834 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.464758 4834 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.465964 4834 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 30 21:15:49 crc kubenswrapper[4834]: 
I0130 21:15:49.466774 4834 server.go:460] "Adding debug handlers to kubelet server" Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.467120 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.467229 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.467372 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="200ms" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.467615 4834 factory.go:55] Registering systemd factory Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.467659 4834 factory.go:221] Registration of the systemd container factory successfully Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.468091 4834 factory.go:153] Registering CRI-O factory Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.468134 4834 factory.go:221] Registration of the crio container factory successfully Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.468287 4834 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 
21:15:49.468343 4834 factory.go:103] Registering Raw factory Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.468378 4834 manager.go:1196] Started watching for new ooms in manager Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.469501 4834 manager.go:319] Starting recovery of all containers Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.470620 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188f9ec5735a40d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 21:15:49.460934872 +0000 UTC m=+0.614081040,LastTimestamp:2026-01-30 21:15:49.460934872 +0000 UTC m=+0.614081040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.488846 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.489698 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.489721 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.489739 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.489753 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.489767 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491793 4834 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491823 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491838 4834 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491852 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491864 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491875 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491885 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491897 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491909 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491920 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491930 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491961 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491972 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491985 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.491998 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492009 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492021 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492035 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492049 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492079 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492091 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492107 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492123 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492134 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492147 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492163 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492178 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" 
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492235 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492253 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492265 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492277 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492289 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492302 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492316 4834 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492328 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492353 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492364 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492377 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492413 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492428 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492441 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492465 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492477 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492506 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492519 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492532 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492545 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492562 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492585 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492597 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492610 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492640 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492652 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492666 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492679 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492689 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492704 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492730 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492743 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492768 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492780 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492795 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492807 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492818 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492830 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492841 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492852 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492881 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492894 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492906 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492918 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492942 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492953 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492964 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.492976 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493002 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" 
seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493013 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493025 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493037 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493048 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493060 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493073 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 
21:15:49.493085 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493113 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493125 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493137 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493148 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493160 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493171 4834 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493182 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493194 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493219 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493232 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493245 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493256 4834 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493268 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493279 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493291 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493303 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493377 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493402 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493414 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493427 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493440 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493452 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493488 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493500 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" 
seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493522 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493533 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493545 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493556 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493568 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493580 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 30 21:15:49 crc 
kubenswrapper[4834]: I0130 21:15:49.493594 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493604 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493636 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493649 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493664 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493678 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493693 4834 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493707 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493718 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493730 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493765 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493778 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493790 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493801 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493811 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493823 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493834 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493845 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493866 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493879 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493890 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493902 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493914 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493926 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493938 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" 
seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493950 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493979 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.493990 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494001 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494012 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494047 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494062 4834 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494072 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494083 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494105 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494116 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494128 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494139 4834 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494150 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494161 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494176 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494190 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494220 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494234 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494250 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494268 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494284 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494299 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494314 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494331 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" 
seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494370 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494385 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494414 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494430 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494449 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494463 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494476 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494490 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494620 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494638 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494653 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494668 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494681 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494694 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494706 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494718 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494747 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494762 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494800 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494833 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494848 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494862 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494875 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494888 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494916 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494929 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494953 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494964 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494976 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.494989 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495001 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495012 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495034 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495044 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495056 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495067 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495079 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495089 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495100 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495109 4834 reconstruct.go:97] "Volume reconstruction finished"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.495117 4834 reconciler.go:26] "Reconciler: start to sync state"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.507575 4834 manager.go:324] Recovery completed
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.520865 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.522995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.523038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.523063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.523967 4834 cpu_manager.go:225] "Starting CPU manager" policy="none"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.523992 4834 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.524013 4834 state_mem.go:36] "Initialized new in-memory state store"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.527351 4834 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.529589 4834 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.529649 4834 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.529687 4834 kubelet.go:2335] "Starting kubelet main sync loop"
Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.530041 4834 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 30 21:15:49 crc kubenswrapper[4834]: W0130 21:15:49.530876 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused
Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.530986 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.543501 4834 policy_none.go:49] "None policy: Start"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.544676 4834 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.544711 4834 state_mem.go:35] "Initializing new in-memory state store"
Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.566189 4834 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.603633 4834 manager.go:334] "Starting Device Plugin manager"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.603724 4834 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.603751 4834 server.go:79] "Starting device plugin registration server"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.604652 4834 eviction_manager.go:189] "Eviction manager: starting control loop"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.604684 4834 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.605224 4834 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.605304 4834 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.605311 4834 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.615109 4834 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.630301 4834 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.630510 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.633082 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.633130 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.633142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.633366 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.633757 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.633873 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.634992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.635023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.635031 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.635111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.635157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.635174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.635374 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.635666 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.635747 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.636652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.636704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.636726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.636708 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.636809 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.636821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.637006 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.637037 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.637080 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638237 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638319 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638506 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638638 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.638681 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.639387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.639432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.639443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.639447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.639465 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.639476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.639741 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.639760 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.640853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.640872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.640880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.668874 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="400ms"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.697593 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.697659 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.697700 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.697732 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.697768 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.697796 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.697832 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.697956 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.698076 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.698117 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.698152 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.698178 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.698230 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.698289 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.698325 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.704870 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.706351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.706454 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.706474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.706539 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: E0130 21:15:49.707048 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799245 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799324 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799361 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799391 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799449 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799480 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799509 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799546 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799550 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799606 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799643 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799641 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799656 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799583 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799589 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799700 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799723 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799648 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799749 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799838 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799931 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") "
pod="openshift-etcd/etcd-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.799963 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.800074 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.800130 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.800232 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.800230 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.800311 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.800342 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.800435 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.907901 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.962472 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 30 21:15:49 crc kubenswrapper[4834]: I0130 21:15:49.987175 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.011629 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.125519 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.125752 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.125805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.125817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.125847 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.125887 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:15:50 crc kubenswrapper[4834]: E0130 21:15:50.126207 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="800ms" Jan 30 21:15:50 crc kubenswrapper[4834]: E0130 21:15:50.126493 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.181432 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-75b2619e9150c36cc94fe71cb1daf1494d026aaef89703e3567f0a17fb7e4afd WatchSource:0}: Error finding container 75b2619e9150c36cc94fe71cb1daf1494d026aaef89703e3567f0a17fb7e4afd: Status 404 returned error can't find the 
container with id 75b2619e9150c36cc94fe71cb1daf1494d026aaef89703e3567f0a17fb7e4afd Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.181868 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-17f6f9ed8c5c19c378bcdebb35db52451ae4f3de1433799d2ddc3abd55b79dd8 WatchSource:0}: Error finding container 17f6f9ed8c5c19c378bcdebb35db52451ae4f3de1433799d2ddc3abd55b79dd8: Status 404 returned error can't find the container with id 17f6f9ed8c5c19c378bcdebb35db52451ae4f3de1433799d2ddc3abd55b79dd8 Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.183948 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-87c04482016e43055bd2e3076ab5e6d2614c1913830b33c55196031d14d304ec WatchSource:0}: Error finding container 87c04482016e43055bd2e3076ab5e6d2614c1913830b33c55196031d14d304ec: Status 404 returned error can't find the container with id 87c04482016e43055bd2e3076ab5e6d2614c1913830b33c55196031d14d304ec Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.187183 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ac9762d729319ca1d01c83fa2706a641550f52f9eb64f7a83d123d54cc41439d WatchSource:0}: Error finding container ac9762d729319ca1d01c83fa2706a641550f52f9eb64f7a83d123d54cc41439d: Status 404 returned error can't find the container with id ac9762d729319ca1d01c83fa2706a641550f52f9eb64f7a83d123d54cc41439d Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.189311 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7287f53d50b0608646d2913fdd6bff5db0e95564c2d60c32db801d8690260641 
WatchSource:0}: Error finding container 7287f53d50b0608646d2913fdd6bff5db0e95564c2d60c32db801d8690260641: Status 404 returned error can't find the container with id 7287f53d50b0608646d2913fdd6bff5db0e95564c2d60c32db801d8690260641 Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.314201 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:50 crc kubenswrapper[4834]: E0130 21:15:50.314276 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.442096 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:50 crc kubenswrapper[4834]: E0130 21:15:50.442227 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.462178 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection 
refused Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.465366 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 21:53:16.287843644 +0000 UTC Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.527634 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.530467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.530534 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.530553 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.530593 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:15:50 crc kubenswrapper[4834]: E0130 21:15:50.531446 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.536052 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac9762d729319ca1d01c83fa2706a641550f52f9eb64f7a83d123d54cc41439d"} Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.537514 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"87c04482016e43055bd2e3076ab5e6d2614c1913830b33c55196031d14d304ec"} Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 
21:15:50.538861 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"75b2619e9150c36cc94fe71cb1daf1494d026aaef89703e3567f0a17fb7e4afd"} Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.540429 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7287f53d50b0608646d2913fdd6bff5db0e95564c2d60c32db801d8690260641"} Jan 30 21:15:50 crc kubenswrapper[4834]: I0130 21:15:50.541463 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"17f6f9ed8c5c19c378bcdebb35db52451ae4f3de1433799d2ddc3abd55b79dd8"} Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.627932 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:50 crc kubenswrapper[4834]: E0130 21:15:50.628029 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:50 crc kubenswrapper[4834]: W0130 21:15:50.824107 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 
30 21:15:50 crc kubenswrapper[4834]: E0130 21:15:50.824859 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:50 crc kubenswrapper[4834]: E0130 21:15:50.928350 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="1.6s" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.332850 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.336065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.336134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.336149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.336185 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:15:51 crc kubenswrapper[4834]: E0130 21:15:51.336910 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.462588 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.465749 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 15:52:41.158144405 +0000 UTC Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.471972 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 21:15:51 crc kubenswrapper[4834]: E0130 21:15:51.472906 4834 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.547419 4834 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535" exitCode=0 Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.547504 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535"} Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.547608 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.549162 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.549221 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.549241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.550262 4834 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695" exitCode=0 Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.550339 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695"} Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.550455 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.551908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.551944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.551958 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.554213 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365" exitCode=0 Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.554374 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.554379 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365"} Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.556189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.556231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.556244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.557092 4834 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="001a6fdb4b5739ccaf6c10f173348ba534254f812c801f5fc2ab468f1fa1ae22" exitCode=0 Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.557549 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"001a6fdb4b5739ccaf6c10f173348ba534254f812c801f5fc2ab468f1fa1ae22"} Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.557589 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.558831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.558860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.558869 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.560381 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de"} Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.560423 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516"} Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.560436 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1"} Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.560900 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.561795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.561829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:51 crc kubenswrapper[4834]: I0130 21:15:51.561838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4834]: W0130 21:15:52.189577 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial 
tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:52 crc kubenswrapper[4834]: E0130 21:15:52.189675 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:52 crc kubenswrapper[4834]: W0130 21:15:52.408795 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:52 crc kubenswrapper[4834]: E0130 21:15:52.408897 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.461851 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.466115 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 07:58:50.656127999 +0000 UTC Jan 30 21:15:52 crc kubenswrapper[4834]: E0130 21:15:52.529530 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="3.2s" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.565101 4834 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf" exitCode=0 Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.565169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.565296 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.566267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.566306 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.566322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.569786 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.569839 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.569859 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.569877 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.572528 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.572637 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"08947faea3efd2eaef883a8bae603edfb3110b6786a019858402cec1dbc58252"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.573457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.573489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.573502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.578179 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.578219 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.579706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.579731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.579740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.586758 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.586783 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.586794 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3"} Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.586861 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:52 crc 
kubenswrapper[4834]: I0130 21:15:52.587468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.587489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.587497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4834]: W0130 21:15:52.806620 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.106:6443: connect: connection refused Jan 30 21:15:52 crc kubenswrapper[4834]: E0130 21:15:52.806680 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.106:6443: connect: connection refused" logger="UnhandledError" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.934234 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.937447 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.938884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.938940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.938957 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:52 crc kubenswrapper[4834]: I0130 21:15:52.938985 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:15:52 crc kubenswrapper[4834]: E0130 21:15:52.939414 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.106:6443: connect: connection refused" node="crc" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.466823 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 23:28:17.038542802 +0000 UTC Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.593548 4834 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9" exitCode=0 Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.593675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9"} Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.594250 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.595623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.595805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.595947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc 
kubenswrapper[4834]: I0130 21:15:53.596715 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.596780 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.596832 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.596722 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.597549 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39"} Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.601516 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.601670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.601778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.601724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.601853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.601804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 
21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.601956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.602004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.602026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.602830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.602903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:53 crc kubenswrapper[4834]: I0130 21:15:53.603000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.468016 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 23:27:24.456017489 +0000 UTC Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.605914 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.606300 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38"} Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.606364 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8"} Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.606387 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0"} Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.606479 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.606550 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.607321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.607354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.607364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.607733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.607781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:54 crc kubenswrapper[4834]: I0130 21:15:54.607796 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.321493 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:15:55 crc 
kubenswrapper[4834]: I0130 21:15:55.386813 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.468981 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 12:36:19.145920735 +0000 UTC Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.606158 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.606344 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.607763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.607813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.607830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.615003 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.615020 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6"} Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.615065 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.615110 4834 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.615063 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af"} Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.616444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.616495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.616513 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.616458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.616601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.616632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:55 crc kubenswrapper[4834]: I0130 21:15:55.684587 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.139953 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.141556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.141633 4834 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.141647 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.141687 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.470162 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 07:51:21.492627407 +0000 UTC Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.617611 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.617679 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.617697 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.618855 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.618933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.618954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.619576 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 21:15:56.619606 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:56 crc kubenswrapper[4834]: I0130 
21:15:56.619617 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:57 crc kubenswrapper[4834]: I0130 21:15:57.289224 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:15:57 crc kubenswrapper[4834]: I0130 21:15:57.470726 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 21:34:48.531470347 +0000 UTC Jan 30 21:15:57 crc kubenswrapper[4834]: I0130 21:15:57.620890 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:57 crc kubenswrapper[4834]: I0130 21:15:57.622550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:57 crc kubenswrapper[4834]: I0130 21:15:57.622614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:57 crc kubenswrapper[4834]: I0130 21:15:57.622631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4834]: I0130 21:15:58.176204 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 30 21:15:58 crc kubenswrapper[4834]: I0130 21:15:58.176466 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:58 crc kubenswrapper[4834]: I0130 21:15:58.178026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:58 crc kubenswrapper[4834]: I0130 21:15:58.178108 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:58 crc kubenswrapper[4834]: I0130 21:15:58.178128 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:58 crc kubenswrapper[4834]: I0130 21:15:58.471740 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 22:16:12.500402278 +0000 UTC Jan 30 21:15:59 crc kubenswrapper[4834]: I0130 21:15:59.040161 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 30 21:15:59 crc kubenswrapper[4834]: I0130 21:15:59.040571 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:15:59 crc kubenswrapper[4834]: I0130 21:15:59.042476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:15:59 crc kubenswrapper[4834]: I0130 21:15:59.042517 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:15:59 crc kubenswrapper[4834]: I0130 21:15:59.042531 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:15:59 crc kubenswrapper[4834]: I0130 21:15:59.472065 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 10:02:04.417723686 +0000 UTC Jan 30 21:15:59 crc kubenswrapper[4834]: E0130 21:15:59.615280 4834 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 30 21:16:00 crc kubenswrapper[4834]: I0130 21:16:00.473132 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 23:42:09.978101385 +0000 UTC Jan 30 21:16:00 crc kubenswrapper[4834]: I0130 21:16:00.680518 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:16:00 crc kubenswrapper[4834]: I0130 21:16:00.680700 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:16:00 crc kubenswrapper[4834]: I0130 21:16:00.681825 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:00 crc kubenswrapper[4834]: I0130 21:16:00.681861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:00 crc kubenswrapper[4834]: I0130 21:16:00.681873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:01 crc kubenswrapper[4834]: I0130 21:16:01.460590 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:16:01 crc kubenswrapper[4834]: I0130 21:16:01.469069 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:16:01 crc kubenswrapper[4834]: I0130 21:16:01.473991 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 23:16:51.62794456 +0000 UTC Jan 30 21:16:01 crc kubenswrapper[4834]: I0130 21:16:01.632802 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:16:01 crc kubenswrapper[4834]: I0130 21:16:01.634330 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:01 crc kubenswrapper[4834]: I0130 21:16:01.634426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:01 crc kubenswrapper[4834]: I0130 21:16:01.634448 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:01 crc kubenswrapper[4834]: I0130 21:16:01.643639 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:16:02 crc kubenswrapper[4834]: I0130 21:16:02.474587 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 23:30:48.690855869 +0000 UTC Jan 30 21:16:02 crc kubenswrapper[4834]: I0130 21:16:02.506114 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:16:02 crc kubenswrapper[4834]: I0130 21:16:02.635261 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:16:02 crc kubenswrapper[4834]: I0130 21:16:02.636250 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:02 crc kubenswrapper[4834]: I0130 21:16:02.636374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:02 crc kubenswrapper[4834]: I0130 21:16:02.636479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:03 crc kubenswrapper[4834]: W0130 21:16:03.350367 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.350708 4834 trace.go:236] Trace[815996132]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 21:15:53.348) (total time: 10002ms): Jan 30 21:16:03 crc 
kubenswrapper[4834]: Trace[815996132]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (21:16:03.350) Jan 30 21:16:03 crc kubenswrapper[4834]: Trace[815996132]: [10.002022951s] [10.002022951s] END Jan 30 21:16:03 crc kubenswrapper[4834]: E0130 21:16:03.350840 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.463429 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.474905 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 15:07:11.835764913 +0000 UTC Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.545297 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.545380 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" 
Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.554001 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.554376 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.638148 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.640158 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.640204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:03 crc kubenswrapper[4834]: I0130 21:16:03.640224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:04 crc kubenswrapper[4834]: I0130 21:16:04.475075 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 04:53:56.493019132 +0000 UTC Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.327833 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.328067 4834 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.329431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.329487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.329507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.335537 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.475896 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 17:07:40.621158926 +0000 UTC Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.506609 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.506757 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.643381 4834 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.644593 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.644628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:05 crc kubenswrapper[4834]: I0130 21:16:05.644640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:06 crc kubenswrapper[4834]: I0130 21:16:06.476062 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 15:26:55.010363588 +0000 UTC Jan 30 21:16:07 crc kubenswrapper[4834]: I0130 21:16:07.477174 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 22:39:12.846327437 +0000 UTC Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.209697 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.210219 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.212008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.212077 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.212090 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.225749 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.478266 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 07:46:52.479874485 +0000 UTC Jan 30 21:16:08 crc kubenswrapper[4834]: E0130 21:16:08.534918 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.542718 4834 trace.go:236] Trace[1299278684]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 21:15:57.862) (total time: 10680ms): Jan 30 21:16:08 crc kubenswrapper[4834]: Trace[1299278684]: ---"Objects listed" error: 10680ms (21:16:08.542) Jan 30 21:16:08 crc kubenswrapper[4834]: Trace[1299278684]: [10.680340769s] [10.680340769s] END Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.542771 4834 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 30 21:16:08 crc kubenswrapper[4834]: E0130 21:16:08.544484 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.545424 4834 trace.go:236] Trace[1447699839]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 21:15:56.256) (total time: 12288ms): Jan 30 21:16:08 crc kubenswrapper[4834]: Trace[1447699839]: ---"Objects listed" error: 12288ms (21:16:08.545) Jan 30 21:16:08 crc kubenswrapper[4834]: Trace[1447699839]: [12.288374303s] [12.288374303s] END Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.545461 4834 reflector.go:368] Caches populated for 
*v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.545610 4834 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.545664 4834 trace.go:236] Trace[1885218924]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Jan-2026 21:15:56.254) (total time: 12291ms): Jan 30 21:16:08 crc kubenswrapper[4834]: Trace[1885218924]: ---"Objects listed" error: 12291ms (21:16:08.545) Jan 30 21:16:08 crc kubenswrapper[4834]: Trace[1885218924]: [12.291388048s] [12.291388048s] END Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.545692 4834 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.549485 4834 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.551105 4834 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.589875 4834 csr.go:261] certificate signing request csr-ndq9f is approved, waiting to be issued Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.604325 4834 csr.go:257] certificate signing request csr-ndq9f is issued Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.685907 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.685969 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.689642 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37042->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.689704 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:37042->192.168.126.11:17697: read: connection reset by peer" Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.689983 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 30 21:16:08 crc kubenswrapper[4834]: I0130 21:16:08.690013 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.224326 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= 
Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.224457 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.318245 4834 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 30 21:16:09 crc kubenswrapper[4834]: W0130 21:16:09.318615 4834 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 30 21:16:09 crc kubenswrapper[4834]: W0130 21:16:09.318657 4834 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 30 21:16:09 crc kubenswrapper[4834]: W0130 21:16:09.318625 4834 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 30 21:16:09 crc kubenswrapper[4834]: W0130 21:16:09.318734 4834 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.318716 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.106:51670->38.102.83.106:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.188f9ec59ea8d2ee openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 21:15:50.187504366 +0000 UTC m=+1.340650514,LastTimestamp:2026-01-30 21:15:50.187504366 +0000 UTC m=+1.340650514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.452658 4834 apiserver.go:52] "Watching apiserver" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.462022 4834 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.462484 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.462969 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.463011 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.463045 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.463383 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.463578 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.463582 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.463688 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.463812 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.463987 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.466588 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.466990 4834 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.467515 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.467564 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.467583 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.467602 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.467666 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.468255 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.469155 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.470562 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.478746 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:05:12.260371011 +0000 UTC Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.510275 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.531706 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.553856 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.553953 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554012 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554065 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554114 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554158 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554208 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554262 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554317 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554368 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554473 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554521 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554572 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554628 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554688 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554736 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554790 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554860 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.554990 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555003 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555050 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555103 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555156 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555179 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555206 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555355 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555387 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555430 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555455 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555526 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555554 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555580 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555604 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555636 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555661 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555682 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555708 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555733 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555764 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555791 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555812 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555831 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555853 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555875 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555906 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555910 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: 
"496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555933 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555988 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.555988 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556019 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556051 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556076 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556098 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556148 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556173 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556203 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556230 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556259 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556283 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556306 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556363 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556386 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556428 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556479 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556505 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556530 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556556 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556578 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556601 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556625 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " 
Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556650 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556672 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556700 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556722 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556747 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556777 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556800 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556850 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556874 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556897 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556920 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " 
Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556946 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556965 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556989 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557014 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557037 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557059 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557085 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557113 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557136 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557157 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557179 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557202 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557223 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557245 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557269 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557291 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557313 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557335 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557362 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557385 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557428 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557450 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557483 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557510 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557532 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557555 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557580 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557644 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557673 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557698 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557721 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557744 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557771 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557795 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557819 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557844 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557871 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557897 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557922 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557958 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557981 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558002 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558030 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558057 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558081 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558108 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558136 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558270 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558300 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558325 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558351 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558379 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558513 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560639 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560698 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560725 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560747 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560769 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563096 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563148 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563174 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563208 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563283 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563328 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563358 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563383 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563425 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563455 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563480 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563508 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563603 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563641 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563667 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563690 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563715 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563736 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563765 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563789 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" 
(UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563810 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563832 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563855 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563878 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563901 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563927 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563952 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563976 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563999 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564024 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564058 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564080 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564103 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564128 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564154 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564179 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 30 
21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564205 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564233 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564255 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564313 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564358 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") 
pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564382 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564422 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564446 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564475 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564501 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564528 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564556 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564584 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564644 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564674 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564733 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564761 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564789 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564817 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564844 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564877 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564962 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565057 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565118 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565155 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565185 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565212 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565244 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565270 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565295 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565322 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 
21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565350 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565378 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565426 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565453 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565484 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565600 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565617 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565633 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565648 4834 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.569210 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556299 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.569712 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556337 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556624 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556852 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.556957 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557051 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557127 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557322 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557336 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557662 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557676 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.557873 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558000 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558078 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558289 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.558960 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.559097 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.559203 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.559520 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.559654 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.559923 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560040 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560118 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560245 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.560439 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561031 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561158 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561176 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561213 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561210 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561235 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561702 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561804 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.561982 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.562013 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.562144 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.562234 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.562421 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.562541 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.562743 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.562892 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563562 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563700 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.563877 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564150 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564182 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564806 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.564872 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.565966 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.566274 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.566584 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.566737 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.566797 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.567378 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.568205 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.568505 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.568599 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.568633 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.568819 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.568994 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.569382 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.569722 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.570789 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.571834 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.571907 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.572232 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.572371 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.572921 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.573291 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.573636 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.573687 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.573691 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.574178 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.574438 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.574636 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.574766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.575094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.574953 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.575382 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.575825 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.576171 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.576239 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.576480 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.576641 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:10.076612049 +0000 UTC m=+21.229758187 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.576667 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.577244 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.577301 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.577468 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.579186 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.579842 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.580540 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.580557 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.580826 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.581075 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.581263 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.581782 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.581911 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.581935 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.582612 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.583609 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.583882 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.584281 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.584766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.585079 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.585134 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.585156 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.586422 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.587180 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.587966 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.588147 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.588667 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.588956 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.589515 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.589551 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.589912 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.590635 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.590627 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.591457 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.591472 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.591656 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.591911 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.592018 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.592035 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.592031 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.592083 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.592300 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.592559 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.593183 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.594626 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.594711 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.594744 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.595046 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.595195 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.595128 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.595369 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.595427 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.595755 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.596073 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.595502 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.596537 4834 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.596653 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.597104 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.597522 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.597856 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.598047 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.598761 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.599068 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.599100 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.600043 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.600437 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.601536 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.602267 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.602720 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.602946 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.622083 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.603075 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.603936 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.604302 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.604630 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.607021 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.610333 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.611885 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.612029 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.612111 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.612269 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.622316 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.622344 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.622468 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:10.122439082 +0000 UTC m=+21.275585420 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.612428 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.622492 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.611871 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.618313 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.618434 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.618541 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.618775 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.621367 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.621719 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.623356 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.621781 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-30 21:11:08 +0000 UTC, rotation deadline is 2026-12-08 02:24:55.190799836 +0000 UTC Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.623544 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7469h8m45.567260153s for next certificate rotation Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.623823 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: 
"metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.624805 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.624887 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:10.124865341 +0000 UTC m=+21.278011679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.624953 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.625112 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.625162 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:10.125152739 +0000 UTC m=+21.278299097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.625727 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.626167 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.626243 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.626332 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.626567 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.626625 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.626933 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.627537 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.628635 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.628658 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.628758 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.629446 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.630307 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.634686 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.637489 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.637650 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.638165 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.641585 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.645023 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.647165 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.654808 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.654857 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.654879 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:09 crc kubenswrapper[4834]: E0130 21:16:09.654997 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2026-01-30 21:16:10.154935179 +0000 UTC m=+21.308081317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.655851 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669197 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669259 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669341 4834 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669353 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669363 4834 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669373 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669383 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669412 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669426 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669438 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669449 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669458 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669467 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669475 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669484 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669493 4834 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669502 4834 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 
21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669511 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669519 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669530 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669539 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669549 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669559 4834 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669567 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669577 4834 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669585 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669594 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669602 4834 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669611 4834 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669619 4834 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669628 4834 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669639 4834 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669648 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669658 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669668 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669678 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669687 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669696 4834 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669704 4834 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669712 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669720 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669728 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669736 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669744 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669752 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669760 4834 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669768 
4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669776 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669785 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669793 4834 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669801 4834 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669810 4834 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669818 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669826 4834 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669835 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669844 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669852 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669861 4834 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669869 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669877 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669886 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669895 4834 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669903 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669913 4834 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669923 4834 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669931 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669940 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669949 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669957 4834 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669965 4834 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669973 4834 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669980 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669988 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.669997 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670005 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" 
Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670012 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670022 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670030 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670038 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670045 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670053 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670090 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 
21:16:09.670100 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670108 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670117 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670126 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670134 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670143 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670152 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670160 4834 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670168 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670176 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670184 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670192 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670200 4834 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670208 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670217 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670225 4834 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670233 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670240 4834 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670248 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670255 4834 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670266 4834 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670274 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670282 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670290 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670298 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670306 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670313 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670321 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670329 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 
21:16:09.670337 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670346 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670353 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670361 4834 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670369 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670378 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670387 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670411 4834 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670420 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670431 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670440 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670449 4834 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670457 4834 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670466 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670474 4834 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670482 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670492 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670501 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670509 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670516 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670526 4834 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670534 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670543 
4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670553 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670563 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670574 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670584 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670593 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670601 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670611 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670619 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670628 4834 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670636 4834 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670645 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670655 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670664 4834 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670673 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node 
\"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670683 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670692 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670703 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670712 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670720 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670730 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670739 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670748 4834 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670757 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670766 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670774 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670782 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670791 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670800 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670809 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670818 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670827 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670837 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670845 4834 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670854 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670862 4834 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670871 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 
21:16:09.670878 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670887 4834 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670895 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670904 4834 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670912 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670920 4834 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670928 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670938 4834 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670947 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670956 4834 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670966 4834 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670974 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670982 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670990 4834 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.670998 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.671005 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.671013 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.671021 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.671029 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.671192 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.671234 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.673275 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.673966 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.685196 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.686272 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.690615 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.692474 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.695241 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39" exitCode=255 Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.696036 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39"} Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.701525 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.707016 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.736845 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.763366 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.771905 4834 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.771938 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.771951 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.774775 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.785904 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.790908 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.791125 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 30 21:16:09 crc kubenswrapper[4834]: W0130 21:16:09.798040 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-3283b632eef157ffa7a9fca6dca6affa388edde1b703353c685ed2ede6dda76c WatchSource:0}: Error finding container 3283b632eef157ffa7a9fca6dca6affa388edde1b703353c685ed2ede6dda76c: Status 404 returned error can't find the container with id 3283b632eef157ffa7a9fca6dca6affa388edde1b703353c685ed2ede6dda76c Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.800660 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.807913 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.808005 4834 scope.go:117] "RemoveContainer" containerID="170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.810597 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: W0130 21:16:09.814426 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-87e9703b3986accc81661fb98e3c2a67bff798e104834edfd5478715f77ccfd0 WatchSource:0}: Error finding container 87e9703b3986accc81661fb98e3c2a67bff798e104834edfd5478715f77ccfd0: Status 404 returned error can't find the container with id 87e9703b3986accc81661fb98e3c2a67bff798e104834edfd5478715f77ccfd0 Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.847808 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.867309 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.889668 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.978926 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-42cwb"] Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.979384 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-42cwb" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.981633 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.982461 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.983691 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 30 21:16:09 crc kubenswrapper[4834]: I0130 21:16:09.994442 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.005911 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.023904 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.034152 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.043799 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.067332 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.074485 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pfs\" (UniqueName: \"kubernetes.io/projected/96ed93d8-d6ab-42f5-8c10-cfc941d1931e-kube-api-access-v7pfs\") pod \"node-resolver-42cwb\" (UID: \"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\") " pod="openshift-dns/node-resolver-42cwb" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.074520 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96ed93d8-d6ab-42f5-8c10-cfc941d1931e-hosts-file\") pod \"node-resolver-42cwb\" (UID: \"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\") " pod="openshift-dns/node-resolver-42cwb" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.081192 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.090989 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.101078 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.175872 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:10 crc 
kubenswrapper[4834]: E0130 21:16:10.176161 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:11.176104125 +0000 UTC m=+22.329250263 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.176484 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.176547 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7pfs\" (UniqueName: \"kubernetes.io/projected/96ed93d8-d6ab-42f5-8c10-cfc941d1931e-kube-api-access-v7pfs\") pod \"node-resolver-42cwb\" (UID: \"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\") " pod="openshift-dns/node-resolver-42cwb" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.176606 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96ed93d8-d6ab-42f5-8c10-cfc941d1931e-hosts-file\") pod \"node-resolver-42cwb\" (UID: \"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\") " 
pod="openshift-dns/node-resolver-42cwb" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.176632 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.176663 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.176690 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176691 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176792 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:11.176771104 +0000 UTC m=+22.329917242 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176860 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.176866 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96ed93d8-d6ab-42f5-8c10-cfc941d1931e-hosts-file\") pod \"node-resolver-42cwb\" (UID: \"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\") " pod="openshift-dns/node-resolver-42cwb" Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176877 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176903 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176919 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176931 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176946 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176961 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.176993 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:11.176980729 +0000 UTC m=+22.330126867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.177008 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:11.17700184 +0000 UTC m=+22.330147978 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:10 crc kubenswrapper[4834]: E0130 21:16:10.177025 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:11.17701686 +0000 UTC m=+22.330162998 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.205152 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pfs\" (UniqueName: \"kubernetes.io/projected/96ed93d8-d6ab-42f5-8c10-cfc941d1931e-kube-api-access-v7pfs\") pod \"node-resolver-42cwb\" (UID: \"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\") " pod="openshift-dns/node-resolver-42cwb" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.294077 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-42cwb" Jan 30 21:16:10 crc kubenswrapper[4834]: W0130 21:16:10.307847 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96ed93d8_d6ab_42f5_8c10_cfc941d1931e.slice/crio-62baaf45b9f97ba169403cdbcf34c1ff67a2feb05650ba19f8bf2fb29d77b973 WatchSource:0}: Error finding container 62baaf45b9f97ba169403cdbcf34c1ff67a2feb05650ba19f8bf2fb29d77b973: Status 404 returned error can't find the container with id 62baaf45b9f97ba169403cdbcf34c1ff67a2feb05650ba19f8bf2fb29d77b973 Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.350548 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5655f"] Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.351682 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.352264 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-drghn"] Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.352552 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-j2m7n"] Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.352665 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.353808 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.355842 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.356799 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.357046 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.357281 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.357510 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.358038 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.358274 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.358334 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.358287 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.358503 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.358664 4834 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.358944 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.370739 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.400534 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\
",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.412908 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.430566 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.442617 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.458215 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.475968 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.478839 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9h82\" (UniqueName: \"kubernetes.io/projected/64f88d18-0675-4d43-82c3-23acaafb56c4-kube-api-access-q9h82\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 
21:16:10.478888 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-kubelet\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.478908 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/296cf2a5-374e-4730-9d40-8abb93c8e237-proxy-tls\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.478926 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-os-release\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.478945 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-hostroot\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.478963 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/64f88d18-0675-4d43-82c3-23acaafb56c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 
21:16:10.479038 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-conf-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479061 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-daemon-config\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479114 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-cnibin\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479133 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-cni-binary-copy\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479150 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-cni-multus\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479168 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64f88d18-0675-4d43-82c3-23acaafb56c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479200 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-etc-kubernetes\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479271 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-cni-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479288 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-netns\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479304 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-multus-certs\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479331 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw9kq\" (UniqueName: \"kubernetes.io/projected/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-kube-api-access-gw9kq\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479346 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-cnibin\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479376 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479416 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/296cf2a5-374e-4730-9d40-8abb93c8e237-rootfs\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479432 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/296cf2a5-374e-4730-9d40-8abb93c8e237-mcd-auth-proxy-config\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " 
pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479458 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-system-cni-dir\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479490 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-k8s-cni-cncf-io\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479506 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-os-release\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479530 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-system-cni-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479547 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-socket-dir-parent\") 
pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479683 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-cni-bin\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.479733 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tdf\" (UniqueName: \"kubernetes.io/projected/296cf2a5-374e-4730-9d40-8abb93c8e237-kube-api-access-k4tdf\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.480071 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 00:50:15.681193714 +0000 UTC Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.511961 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 
21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-di
r\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.537637 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.556764 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.575840 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580328 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-k8s-cni-cncf-io\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580443 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-os-release\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580483 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-system-cni-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-socket-dir-parent\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580533 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-k8s-cni-cncf-io\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580601 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-cni-bin\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580551 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-cni-bin\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580649 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tdf\" (UniqueName: \"kubernetes.io/projected/296cf2a5-374e-4730-9d40-8abb93c8e237-kube-api-access-k4tdf\") pod \"machine-config-daemon-drghn\" 
(UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580672 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9h82\" (UniqueName: \"kubernetes.io/projected/64f88d18-0675-4d43-82c3-23acaafb56c4-kube-api-access-q9h82\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580676 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-socket-dir-parent\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580719 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-kubelet\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580695 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-kubelet\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580748 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-system-cni-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " 
pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580759 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/296cf2a5-374e-4730-9d40-8abb93c8e237-proxy-tls\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580808 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-os-release\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580831 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-hostroot\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580853 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/64f88d18-0675-4d43-82c3-23acaafb56c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580892 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-conf-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580909 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-daemon-config\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580944 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-cnibin\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580964 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-cni-binary-copy\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.580981 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-cni-multus\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581001 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64f88d18-0675-4d43-82c3-23acaafb56c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581023 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-etc-kubernetes\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581065 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-netns\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581085 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-multus-certs\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581067 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-os-release\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581104 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw9kq\" (UniqueName: \"kubernetes.io/projected/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-kube-api-access-gw9kq\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581213 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-cnibin\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " 
pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581244 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-cnibin\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-cni-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581292 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581314 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/296cf2a5-374e-4730-9d40-8abb93c8e237-rootfs\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581341 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/296cf2a5-374e-4730-9d40-8abb93c8e237-mcd-auth-proxy-config\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 
30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581364 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-system-cni-dir\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581366 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-os-release\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581419 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-system-cni-dir\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581441 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-etc-kubernetes\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581470 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-netns\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581498 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-run-multus-certs\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581506 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-cni-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581643 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-hostroot\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581658 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/296cf2a5-374e-4730-9d40-8abb93c8e237-rootfs\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581670 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-conf-dir\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581710 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-cnibin\") pod 
\"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581741 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-host-var-lib-cni-multus\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.581755 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-multus-daemon-config\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.582164 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-cni-binary-copy\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.582299 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/64f88d18-0675-4d43-82c3-23acaafb56c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.582371 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/296cf2a5-374e-4730-9d40-8abb93c8e237-mcd-auth-proxy-config\") pod \"machine-config-daemon-drghn\" (UID: 
\"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.582425 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64f88d18-0675-4d43-82c3-23acaafb56c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.583176 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64f88d18-0675-4d43-82c3-23acaafb56c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.594056 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.604308 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9h82\" (UniqueName: \"kubernetes.io/projected/64f88d18-0675-4d43-82c3-23acaafb56c4-kube-api-access-q9h82\") pod \"multus-additional-cni-plugins-j2m7n\" (UID: \"64f88d18-0675-4d43-82c3-23acaafb56c4\") " pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.605412 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/296cf2a5-374e-4730-9d40-8abb93c8e237-proxy-tls\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.608950 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4tdf\" (UniqueName: \"kubernetes.io/projected/296cf2a5-374e-4730-9d40-8abb93c8e237-kube-api-access-k4tdf\") pod \"machine-config-daemon-drghn\" (UID: \"296cf2a5-374e-4730-9d40-8abb93c8e237\") " pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.610858 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw9kq\" (UniqueName: \"kubernetes.io/projected/25f6f1cd-cd4b-475a-85a3-4e81cda5d203-kube-api-access-gw9kq\") pod \"multus-5655f\" (UID: \"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\") " pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.612458 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.630222 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.645344 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.657979 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.666206 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.668258 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5655f" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.679438 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.680557 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:16:10 crc kubenswrapper[4834]: W0130 21:16:10.694328 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod296cf2a5_374e_4730_9d40_8abb93c8e237.slice/crio-a32ee0aa9f1218a3f14a1cd84b4fff4f93bfdb2becb8083e52248a7193ded726 WatchSource:0}: Error finding container a32ee0aa9f1218a3f14a1cd84b4fff4f93bfdb2becb8083e52248a7193ded726: Status 404 returned error can't find the container with id a32ee0aa9f1218a3f14a1cd84b4fff4f93bfdb2becb8083e52248a7193ded726 Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.696455 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.701905 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.703933 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5655f" event={"ID":"25f6f1cd-cd4b-475a-85a3-4e81cda5d203","Type":"ContainerStarted","Data":"13a75829fc3d550459c7ff94ecb42b0b2622e05dfe1437144c930750b4ae4345"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.705750 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.705773 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.705784 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3283b632eef157ffa7a9fca6dca6affa388edde1b703353c685ed2ede6dda76c"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.718602 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-42cwb" 
event={"ID":"96ed93d8-d6ab-42f5-8c10-cfc941d1931e","Type":"ContainerStarted","Data":"3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.718677 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-42cwb" event={"ID":"96ed93d8-d6ab-42f5-8c10-cfc941d1931e","Type":"ContainerStarted","Data":"62baaf45b9f97ba169403cdbcf34c1ff67a2feb05650ba19f8bf2fb29d77b973"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.719759 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.720056 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"87e9703b3986accc81661fb98e3c2a67bff798e104834edfd5478715f77ccfd0"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.728297 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.728445 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"36ca85c551e616f5a260a67c80eadff4fbb7962b9d7763176b393e75a5a67a65"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.733035 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.738220 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553
a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.745670 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.746044 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.750547 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"a32ee0aa9f1218a3f14a1cd84b4fff4f93bfdb2becb8083e52248a7193ded726"} Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 
21:16:10.757893 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.758812 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xmxm"] Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.765425 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.769689 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.770158 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.770745 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.770771 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.771054 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.771136 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.771208 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.781709 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.796085 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.808566 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.822898 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.839595 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.858264 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.872161 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.883669 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-slash\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.883729 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-etc-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.883781 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-kubelet\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.883815 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-ovn\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.883892 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-env-overrides\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884008 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884176 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-netns\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884235 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-netd\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884281 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-log-socket\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884513 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1133254b-8923-414d-8031-4dfe81f17e12-ovn-node-metrics-cert\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" 
Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-var-lib-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884670 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-script-lib\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884700 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-config\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884748 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884769 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkdbm\" (UniqueName: \"kubernetes.io/projected/1133254b-8923-414d-8031-4dfe81f17e12-kube-api-access-qkdbm\") pod \"ovnkube-node-4xmxm\" (UID: 
\"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884788 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884805 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-node-log\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884843 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-bin\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884859 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-systemd-units\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.884879 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-systemd\") pod \"ovnkube-node-4xmxm\" (UID: 
\"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.887097 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.901920 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.914504 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.929624 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.945902 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.967652 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.982745 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:10Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986229 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986309 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qkdbm\" (UniqueName: \"kubernetes.io/projected/1133254b-8923-414d-8031-4dfe81f17e12-kube-api-access-qkdbm\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986346 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-systemd-units\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986368 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-node-log\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986386 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-bin\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986420 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-systemd\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986436 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-etc-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986452 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-slash\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986468 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-kubelet\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986459 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986519 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986586 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-bin\") pod 
\"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986592 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-kubelet\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986545 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-etc-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986588 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-systemd\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986639 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-systemd-units\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986555 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-ovn\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc 
kubenswrapper[4834]: I0130 21:16:10.986575 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-node-log\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986563 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-slash\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986487 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-ovn\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986756 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-env-overrides\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986831 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986906 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-netns\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986930 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-netd\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986968 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-log-socket\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.986972 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987002 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1133254b-8923-414d-8031-4dfe81f17e12-ovn-node-metrics-cert\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987043 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-netd\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987054 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-var-lib-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987079 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-netns\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987081 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-script-lib\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987135 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-config\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987422 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-env-overrides\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987891 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-log-socket\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987958 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-var-lib-openvswitch\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.987963 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-script-lib\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.988142 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-config\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:10 crc kubenswrapper[4834]: I0130 21:16:10.993576 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1133254b-8923-414d-8031-4dfe81f17e12-ovn-node-metrics-cert\") pod 
\"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.003875 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkdbm\" (UniqueName: \"kubernetes.io/projected/1133254b-8923-414d-8031-4dfe81f17e12-kube-api-access-qkdbm\") pod \"ovnkube-node-4xmxm\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.009700 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.023429 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.037928 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.050258 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.063155 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.077446 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.083274 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.093710 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: W0130 21:16:11.099120 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1133254b_8923_414d_8031_4dfe81f17e12.slice/crio-c955e224eef7499b97cb949ce5bdc2b397a08fdd06657708bd68499e884b1908 WatchSource:0}: Error finding container c955e224eef7499b97cb949ce5bdc2b397a08fdd06657708bd68499e884b1908: Status 404 returned error can't find the container with id c955e224eef7499b97cb949ce5bdc2b397a08fdd06657708bd68499e884b1908 Jan 30 21:16:11 crc 
kubenswrapper[4834]: I0130 21:16:11.122955 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.170034 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.189073 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.189155 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.189185 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:11 
crc kubenswrapper[4834]: I0130 21:16:11.189208 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.189226 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189325 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189374 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:13.189360734 +0000 UTC m=+24.342506872 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189444 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:13.189437417 +0000 UTC m=+24.342583555 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189515 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189526 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189536 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189558 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:13.18955084 +0000 UTC m=+24.342696978 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189589 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189612 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:13.189606441 +0000 UTC m=+24.342752569 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189655 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189665 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189671 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.189690 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:13.189683313 +0000 UTC m=+24.342829451 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.208593 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.246327 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.480938 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 00:41:05.805703806 +0000 UTC Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.530691 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.530785 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.530834 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.530924 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.530839 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:11 crc kubenswrapper[4834]: E0130 21:16:11.531045 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.534749 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.535448 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.536626 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.537262 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.538439 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.539134 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.539741 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.540737 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.541340 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.542241 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.542797 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.543864 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.544355 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.544879 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.545780 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.546286 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.547480 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.547895 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.548464 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.549451 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.549943 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.551369 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.551949 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.553263 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.553850 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.554626 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.555790 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.556315 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.557296 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.557791 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.558637 4834 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.558734 4834 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.560416 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.561323 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.561801 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.563291 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.563930 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.564954 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.565593 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.566696 4834 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.567183 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.568160 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.568924 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.569847 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.570287 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.571151 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.571681 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.572765 4834 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.573240 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.574069 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.574525 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.575406 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.575941 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.576411 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.755596 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" 
event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee"} Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.755639 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243"} Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.757634 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5655f" event={"ID":"25f6f1cd-cd4b-475a-85a3-4e81cda5d203","Type":"ContainerStarted","Data":"0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25"} Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.759071 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3" exitCode=0 Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.759111 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3"} Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.759125 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"c955e224eef7499b97cb949ce5bdc2b397a08fdd06657708bd68499e884b1908"} Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.760970 4834 generic.go:334] "Generic (PLEG): container finished" podID="64f88d18-0675-4d43-82c3-23acaafb56c4" containerID="490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b" exitCode=0 Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 
21:16:11.761059 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerDied","Data":"490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b"} Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.761116 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerStarted","Data":"592d0451b7093d1c0d0b0ce258c94090e8336e43747e5b6875a7a60088da5311"} Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.776028 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.798445 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.816881 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.830847 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.841865 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.852467 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.867362 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.876608 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.891323 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.907334 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.920320 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.945772 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.958022 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.972249 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325745
3265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:11 crc kubenswrapper[4834]: I0130 21:16:11.990497 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:11Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.004146 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.014597 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.032700 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.047460 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.063856 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.098484 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.122926 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.165703 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.205349 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.243493 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.284937 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.285713 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sqm85"] Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.286643 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.315441 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.338790 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.355990 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.375301 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.399844 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2qm9\" (UniqueName: \"kubernetes.io/projected/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-kube-api-access-l2qm9\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.399997 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-host\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.400082 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-serviceca\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc 
kubenswrapper[4834]: I0130 21:16:12.408231 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.445096 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.481388 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 00:31:32.835758061 +0000 UTC Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.484633 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9k
q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.501440 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2qm9\" (UniqueName: \"kubernetes.io/projected/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-kube-api-access-l2qm9\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.501508 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-host\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.501531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-serviceca\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.502643 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-serviceca\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.502716 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-host\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.509470 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.516753 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.525180 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.554309 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2qm9\" (UniqueName: \"kubernetes.io/projected/3d3a851f-fc14-4b9c-b9c1-a92da4b27262-kube-api-access-l2qm9\") pod \"node-ca-sqm85\" (UID: \"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\") " pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.562313 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.598949 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sqm85" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.613666 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: W0130 21:16:12.614356 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d3a851f_fc14_4b9c_b9c1_a92da4b27262.slice/crio-8d7e268a86d50a5e064dce445b074e518caf6f49308fa0b78ba6c7dfd1501109 WatchSource:0}: Error finding container 8d7e268a86d50a5e064dce445b074e518caf6f49308fa0b78ba6c7dfd1501109: Status 404 returned error can't find the container with id 8d7e268a86d50a5e064dce445b074e518caf6f49308fa0b78ba6c7dfd1501109 Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.647794 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.686457 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.724616 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.763803 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.774857 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e"} Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.777007 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerStarted","Data":"50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96"} Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.781644 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a"} Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.781696 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27"} Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.781726 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5"} Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.781739 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2"} Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.781748 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1"} Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.785704 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqm85" event={"ID":"3d3a851f-fc14-4b9c-b9c1-a92da4b27262","Type":"ContainerStarted","Data":"8d7e268a86d50a5e064dce445b074e518caf6f49308fa0b78ba6c7dfd1501109"} Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.802552 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: E0130 21:16:12.819024 4834 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.862493 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.905655 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:12 crc kubenswrapper[4834]: I0130 21:16:12.946783 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:12Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.008367 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.045783 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.066993 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.105218 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.145731 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.193569 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.208313 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.208581 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 21:16:17.208542699 +0000 UTC m=+28.361688887 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.208697 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.208796 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.208876 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.208960 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209038 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209096 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209141 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:17.209111425 +0000 UTC m=+28.362257603 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209181 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:17.209158776 +0000 UTC m=+28.362304964 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209223 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209274 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209292 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209388 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:17.209356802 +0000 UTC m=+28.362502940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209931 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.209987 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.210017 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.210204 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:17.210111673 +0000 UTC m=+28.363258001 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.232201 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.266201 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.304459 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.344176 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.385748 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.424816 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.464254 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.481693 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 09:38:44.406836068 +0000 UTC Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.506503 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.530967 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.531118 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.530980 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.531239 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.531272 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:13 crc kubenswrapper[4834]: E0130 21:16:13.531444 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.546114 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.602642 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.794511 4834 generic.go:334] "Generic (PLEG): container finished" podID="64f88d18-0675-4d43-82c3-23acaafb56c4" containerID="50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96" exitCode=0 Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.794600 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerDied","Data":"50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96"} Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.802548 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8"} Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.805617 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sqm85" event={"ID":"3d3a851f-fc14-4b9c-b9c1-a92da4b27262","Type":"ContainerStarted","Data":"4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5"} Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.813278 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.835634 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\
\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.858655 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.883249 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.898582 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.915109 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.930908 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 
21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.956959 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"
data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.976003 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:13 crc kubenswrapper[4834]: I0130 21:16:13.996181 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:13Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.028563 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.068528 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.112805 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.158386 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.184510 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.227938 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.281482 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.305862 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.350622 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.388133 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.433297 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.468738 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.482513 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 04:01:41.018872177 +0000 UTC Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.510277 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.545956 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.590974 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.628680 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.667622 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.707304 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.765045 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e926356
48fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.786578 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.814012 4834 generic.go:334] "Generic (PLEG): container finished" podID="64f88d18-0675-4d43-82c3-23acaafb56c4" containerID="e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b" exitCode=0 Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.814107 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerDied","Data":"e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b"} Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.833988 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.868070 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.909813 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.944883 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.945665 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:14Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.947262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.947314 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:14 crc kubenswrapper[4834]: I0130 21:16:14.947333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:14 crc 
kubenswrapper[4834]: I0130 21:16:14.947559 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.005490 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 
21:16:15.016317 4834 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.016654 4834 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.018054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.018105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.018123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.018144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.018162 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.042415 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.049113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.049176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.049193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.049224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.049242 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.068762 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.073562 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.076543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.076591 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.076609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.076635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.076653 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.094366 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.099037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.099099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.099117 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.099144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.099161 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.109536 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.117382 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.122489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.122530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.122545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.122572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.122588 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.143889 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.144117 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.146650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.146710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.146724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.146813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.146833 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.147989 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.192548 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed21
69d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.231429 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.251478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.251541 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.251560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 
21:16:15.251587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.251608 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.269764 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.309722 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.350865 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.355555 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.355620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.355635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.355660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.355674 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.388193 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.441557 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.458799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.458849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.458866 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.458889 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.458906 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.483628 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 15:22:42.790632398 +0000 UTC Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.530715 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.530727 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.531248 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.530796 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.531755 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:15 crc kubenswrapper[4834]: E0130 21:16:15.531994 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.562279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.562336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.562353 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.562377 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.562433 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.665270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.665320 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.665336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.665358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.665376 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.771826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.771882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.771902 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.771929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.771950 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.843750 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.848169 4834 generic.go:334] "Generic (PLEG): container finished" podID="64f88d18-0675-4d43-82c3-23acaafb56c4" containerID="e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57" exitCode=0 Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.848217 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerDied","Data":"e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.871381 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.875678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.875747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.875767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.876120 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.876334 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.906717 4834 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.913869 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.936043 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.957327 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.973934 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.979557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.979607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.979634 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:15 crc 
kubenswrapper[4834]: I0130 21:16:15.979663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:15 crc kubenswrapper[4834]: I0130 21:16:15.979682 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:15Z","lastTransitionTime":"2026-01-30T21:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.001383 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccb
dbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:15Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.028618 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.042325 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251
fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.062548 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.081093 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.092371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.092445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.092457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.092482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.092497 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.106282 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.128260 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.154408 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.173124 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.190612 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.196117 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.196157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.196169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.196189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.196205 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.299350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.299468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.299495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.299525 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.299544 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.403452 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.403515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.403536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.403559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.403576 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.484675 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 07:08:46.074960979 +0000 UTC Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.506614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.506683 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.506708 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.506739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.506764 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.610767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.610838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.610859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.610885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.610903 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.714857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.714917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.714937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.714973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.715012 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.818375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.818478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.818495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.818522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.818542 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.863366 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerStarted","Data":"2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.886212 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\
\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.922097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.922148 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.922165 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.922200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.922217 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:16Z","lastTransitionTime":"2026-01-30T21:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.922717 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.944164 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.966670 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:16 crc kubenswrapper[4834]: I0130 21:16:16.988974 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:16Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.005287 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.025468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.025516 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.025528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc 
kubenswrapper[4834]: I0130 21:16:17.025554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.025575 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.026981 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccb
dbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.052770 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.067089 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.080942 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.096165 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.128624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc 
kubenswrapper[4834]: I0130 21:16:17.129054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.129294 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.129514 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.129670 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.142196 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.172852 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.195721 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.210031 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.232879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.232924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.232937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.232955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.232969 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.254026 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.254361 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254448 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:25.254374607 +0000 UTC m=+36.407520905 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.254535 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254557 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254660 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:25.254636744 +0000 UTC m=+36.407782902 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.254698 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254738 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254768 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254805 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254832 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.254751 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254845 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254963 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.254860 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:25.25484261 +0000 UTC m=+36.407988748 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.255007 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.255051 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:25.255021275 +0000 UTC m=+36.408167543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.255105 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:25.255076157 +0000 UTC m=+36.408222345 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.336974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.337008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.337018 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.337034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.337045 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.440497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.440536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.440546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.440561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.440573 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.485765 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 18:30:09.008963874 +0000 UTC Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.530134 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.530177 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.530378 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.530562 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.530731 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:17 crc kubenswrapper[4834]: E0130 21:16:17.530900 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.543847 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.543933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.543957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.543992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.544014 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.647992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.648485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.648562 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.648634 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.648718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.752457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.752777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.752904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.753013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.753116 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.856557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.856773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.856875 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.856977 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.857099 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.870851 4834 generic.go:334] "Generic (PLEG): container finished" podID="64f88d18-0675-4d43-82c3-23acaafb56c4" containerID="2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19" exitCode=0 Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.870968 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerDied","Data":"2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.879203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.879870 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.880035 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.906308 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.915219 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.915322 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.931640 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.960048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.960098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.960116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.960144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.960164 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:17Z","lastTransitionTime":"2026-01-30T21:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.961849 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.978254 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:17 crc kubenswrapper[4834]: I0130 21:16:17.997448 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.015969 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.029202 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.040706 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.057489 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.062455 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.062503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.062527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.062552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.062570 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.071173 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.084157 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.098363 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.113814 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.132013 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.150714 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.164886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.164951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.164969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.164997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.165016 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.177451 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.192938 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.205112 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.223199 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.254128 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.267979 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.268024 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.268037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.268056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.268069 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.272156 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.295121 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.315253 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.336061 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.353698 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.370460 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.370523 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc 
kubenswrapper[4834]: I0130 21:16:18.370564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.370590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.370609 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.373984 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.397793 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.421349 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.441377 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.455336 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.472750 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.472810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.472827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.472849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.472865 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.485962 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:45:17.487656149 +0000 UTC Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.576258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.576319 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.576338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.576367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.576385 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.679546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.679596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.679615 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.679640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.679659 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.782251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.782314 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.782354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.782386 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.782441 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.884741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.884809 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.884830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.884855 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.884874 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.888311 4834 generic.go:334] "Generic (PLEG): container finished" podID="64f88d18-0675-4d43-82c3-23acaafb56c4" containerID="279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf" exitCode=0 Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.888426 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerDied","Data":"279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.888542 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.905663 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.917926 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.932092 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.947183 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.962739 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.978821 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.988173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.988207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.988218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.988238 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.988249 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:18Z","lastTransitionTime":"2026-01-30T21:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:18 crc kubenswrapper[4834]: I0130 21:16:18.996668 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:18Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.014166 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.039705 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.050313 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.065041 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.079414 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.091246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.091368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.091464 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.091549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.091636 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.093812 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.113695 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.132956 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.194214 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.194257 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.194269 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.194287 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.194301 4834 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.297356 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.297481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.297501 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.297527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.297545 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.400957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.401027 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.401046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.401077 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.401101 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.486757 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 23:11:07.607481193 +0000 UTC Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.505059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.505136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.505156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.505188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.505210 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.530668 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.530742 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.530676 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:19 crc kubenswrapper[4834]: E0130 21:16:19.530871 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:19 crc kubenswrapper[4834]: E0130 21:16:19.530997 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:19 crc kubenswrapper[4834]: E0130 21:16:19.531068 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.568085 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-3
0T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.587617 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.607954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.608013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.608028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.608050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.608066 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.615437 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.641846 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.660706 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.676980 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.687154 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.701461 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.710335 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.710368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.710379 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.710411 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.710425 4834 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.717700 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.729936 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.741958 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.752901 4834 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.757479 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd
97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.776794 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.792907 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.807016 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.812645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.812702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.812719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.812741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.812759 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.825192 4834 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.895459 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" event={"ID":"64f88d18-0675-4d43-82c3-23acaafb56c4","Type":"ContainerStarted","Data":"8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.895560 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.908319 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.915830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.915904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.915923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 
21:16:19.915950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.915965 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:19Z","lastTransitionTime":"2026-01-30T21:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.921645 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.935632 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.951468 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.968746 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:19 crc kubenswrapper[4834]: I0130 21:16:19.980971 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.004195 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.019283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.019364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.019385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.019448 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.019470 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.026585 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5
c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.058969 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.080544 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.106173 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.122824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.122892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.122912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.122937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.122956 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.126912 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.139953 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.155719 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.173831 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.226682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.226806 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.226833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.226862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.226883 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.329347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.329444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.329461 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.329484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.329501 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.432582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.432655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.432680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.432710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.432732 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.487113 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 16:02:31.677729872 +0000 UTC Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.536158 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.536210 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.536223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.536252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.536268 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.640300 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.640367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.640390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.640443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.640461 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.743943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.744070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.744091 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.744114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.744133 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.847782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.847841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.847854 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.847875 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.847889 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.901599 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/0.log" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.905818 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f" exitCode=1 Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.905871 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.906777 4834 scope.go:117] "RemoveContainer" containerID="23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.930016 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.951654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.951717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.951734 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.951762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.951783 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:20Z","lastTransitionTime":"2026-01-30T21:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.955586 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.975027 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:20 crc kubenswrapper[4834]: I0130 21:16:20.993062 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:20Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.019835 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.051264 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:20Z\\\",\\\"message\\\":\\\"r removal\\\\nI0130 21:16:20.235707 6070 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 21:16:20.235716 6070 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 21:16:20.235753 6070 
handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0130 21:16:20.235757 6070 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:20.235760 6070 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:20.235798 6070 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 21:16:20.235769 6070 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0130 21:16:20.235766 6070 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 21:16:20.235831 6070 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0130 21:16:20.235856 6070 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 21:16:20.235904 6070 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:20.236037 6070 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:16:20.236202 6070 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:16:20.236248 6070 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 21:16:20.236362 6070 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:16:20.236363 6070 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.054055 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.054082 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.054093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.054109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.054121 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.071197 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.096858 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.124133 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.146632 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.156664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.156706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.156723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.156748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.156766 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.177771 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.193335 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.206454 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.239380 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e926356
48fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.258446 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.260306 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.260335 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.260345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.260363 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.260373 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.362853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.362913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.362932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.362958 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.362975 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.466085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.466125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.466137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.466159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.466170 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.487962 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 08:56:53.758823969 +0000 UTC Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.530538 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.530541 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:21 crc kubenswrapper[4834]: E0130 21:16:21.530683 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:21 crc kubenswrapper[4834]: E0130 21:16:21.530885 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.531046 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:21 crc kubenswrapper[4834]: E0130 21:16:21.531268 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.568930 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.568970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.568982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.568999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.569011 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.672284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.672322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.672334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.672349 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.672361 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.717424 4834 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.775764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.775878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.775900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.775927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.775946 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.879565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.879627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.879646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.879684 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.879703 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.912796 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/0.log" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.916975 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.923477 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.949154 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.972843 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.982607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.982947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.983100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.983240 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.983438 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:21Z","lastTransitionTime":"2026-01-30T21:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:21 crc kubenswrapper[4834]: I0130 21:16:21.994566 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:21Z 
is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.010531 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.023727 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.035916 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.058139 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e926356
48fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.074710 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.086108 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.086160 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.086174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.086194 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.086208 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.089256 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.103849 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.125768 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.140382 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.156224 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.189658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.189749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.189776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.189808 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.189832 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.192967 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:20Z\\\",\\\"message\\\":\\\"r removal\\\\nI0130 21:16:20.235707 6070 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 21:16:20.235716 6070 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 21:16:20.235753 6070 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0130 21:16:20.235757 6070 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0130 21:16:20.235760 6070 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:20.235798 6070 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 21:16:20.235769 6070 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0130 21:16:20.235766 6070 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 21:16:20.235831 6070 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0130 21:16:20.235856 6070 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 21:16:20.235904 6070 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:20.236037 6070 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:16:20.236202 6070 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:16:20.236248 6070 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 21:16:20.236362 6070 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:16:20.236363 6070 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.213019 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.252697 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj"] Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.253562 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.255499 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.256170 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.275284 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.291706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.291770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.291793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.291824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.291849 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.303551 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.310603 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.310670 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: 
\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.310710 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.310763 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnmpq\" (UniqueName: \"kubernetes.io/projected/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-kube-api-access-vnmpq\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.319505 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.332636 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.347467 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.364063 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.384901 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:20Z\\\",\\\"message\\\":\\\"r removal\\\\nI0130 21:16:20.235707 6070 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 21:16:20.235716 6070 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 21:16:20.235753 6070 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0130 21:16:20.235757 6070 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0130 21:16:20.235760 6070 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:20.235798 6070 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 21:16:20.235769 6070 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0130 21:16:20.235766 6070 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 21:16:20.235831 6070 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0130 21:16:20.235856 6070 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 21:16:20.235904 6070 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:20.236037 6070 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:16:20.236202 6070 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:16:20.236248 6070 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 21:16:20.236362 6070 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:16:20.236363 6070 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.395526 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.395574 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.395590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.395611 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.395628 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.399828 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.412485 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnmpq\" (UniqueName: \"kubernetes.io/projected/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-kube-api-access-vnmpq\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.412663 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.412728 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.412778 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.413913 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.413930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.419276 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.421291 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.435096 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.445334 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnmpq\" (UniqueName: \"kubernetes.io/projected/8c550cc8-1d4f-40dd-9eac-8f11c34663dc-kube-api-access-vnmpq\") pod \"ovnkube-control-plane-749d76644c-76slj\" (UID: \"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.453848 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.473032 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.489144 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 15:17:22.602284532 +0000 UTC Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.491973 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.498282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.498371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.498423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.498449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.498465 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.529801 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.551696 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.563831 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.569024 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.601047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.601086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.601099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.601120 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.601135 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.703734 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.703785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.703794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.703810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.703821 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.805625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.805933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.805943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.805956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.805965 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.909271 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.909332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.909351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.909376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.909428 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:22Z","lastTransitionTime":"2026-01-30T21:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.924231 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" event={"ID":"8c550cc8-1d4f-40dd-9eac-8f11c34663dc","Type":"ContainerStarted","Data":"3f8d2d8b25535c5c5eea4f613ca32d85a6e591e1eea2bcea06821f44ab4e48cc"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.924304 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" event={"ID":"8c550cc8-1d4f-40dd-9eac-8f11c34663dc","Type":"ContainerStarted","Data":"f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.924327 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" event={"ID":"8c550cc8-1d4f-40dd-9eac-8f11c34663dc","Type":"ContainerStarted","Data":"28fa604dbd58fca0ed7827e1f7e1a4c33d879e7054b3ee9603bc449b79f54133"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.926758 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/1.log" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.927867 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/0.log" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.933433 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f" exitCode=1 Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.933507 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" 
event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f"} Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.933619 4834 scope.go:117] "RemoveContainer" containerID="23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.934658 4834 scope.go:117] "RemoveContainer" containerID="c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f" Jan 30 21:16:22 crc kubenswrapper[4834]: E0130 21:16:22.934920 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.946831 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.969449 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:22 crc kubenswrapper[4834]: I0130 21:16:22.987656 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:22Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.006843 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.011705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.011762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.011780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc 
kubenswrapper[4834]: I0130 21:16:23.011804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.011824 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.040569 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
9556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.069868 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:20Z\\\",\\\"message\\\":\\\"r removal\\\\nI0130 21:16:20.235707 6070 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 21:16:20.235716 6070 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 21:16:20.235753 6070 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0130 21:16:20.235757 6070 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0130 21:16:20.235760 6070 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:20.235798 6070 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 21:16:20.235769 6070 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0130 21:16:20.235766 6070 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 21:16:20.235831 6070 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0130 21:16:20.235856 6070 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 21:16:20.235904 6070 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:20.236037 6070 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:16:20.236202 6070 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:16:20.236248 6070 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 21:16:20.236362 6070 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:16:20.236363 6070 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.122507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.122550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.122563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.122582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.122592 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.122823 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.137512 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.151139 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.167970 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.185009 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.198699 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.213264 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.224836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.224886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.224895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.224918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.224929 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.235052 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.254265 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 
21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.287113 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"
data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.316205 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"nam
e\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"
containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.328648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.328707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: 
I0130 21:16:23.328760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.328785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.328804 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.333473 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.348024 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.369382 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.401789 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:20Z\\\",\\\"message\\\":\\\"r removal\\\\nI0130 21:16:20.235707 6070 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 21:16:20.235716 6070 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 21:16:20.235753 6070 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0130 21:16:20.235757 6070 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0130 21:16:20.235760 6070 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:20.235798 6070 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 21:16:20.235769 6070 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0130 21:16:20.235766 6070 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 21:16:20.235831 6070 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0130 21:16:20.235856 6070 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 21:16:20.235904 6070 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:20.236037 6070 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:16:20.236202 6070 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:16:20.236248 6070 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 21:16:20.236362 6070 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:16:20.236363 6070 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"2 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:22.048909 6232 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.048950 6232 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:16:22.049165 6232 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049192 
6232 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049230 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049385 6232 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049727 6232 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050109 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050376 6232 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.425036 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.435610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.435659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.435673 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.435698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.435718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.446453 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.467058 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.489918 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.490800 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 20:03:30.010334542 +0000 UTC Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.508618 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e591e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.525747 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.530590 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.530635 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.530597 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:23 crc kubenswrapper[4834]: E0130 21:16:23.530759 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:23 crc kubenswrapper[4834]: E0130 21:16:23.530909 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:23 crc kubenswrapper[4834]: E0130 21:16:23.531228 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.538947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.538980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.538989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.539007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.539018 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.543605 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.567469 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.589450 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.615418 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.636844 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:23Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.642509 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.642561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.642575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.642595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.642608 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.745952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.746478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.746521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.746559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.746588 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.850973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.851038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.851056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.851084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.851103 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.939723 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/1.log" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.953348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.953433 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.953453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.953478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:23 crc kubenswrapper[4834]: I0130 21:16:23.953497 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:23Z","lastTransitionTime":"2026-01-30T21:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.057071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.057141 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.057154 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.057175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.057191 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.160619 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.160727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.160746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.160775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.160796 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.263943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.263986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.263996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.264013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.264025 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.368111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.368198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.368222 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.368255 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.368277 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.471435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.471548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.471563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.471588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.471606 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.491843 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 05:08:25.564262048 +0000 UTC Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.575212 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.575251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.575263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.575284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.575298 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.585647 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-j5pcw"] Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.586207 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:24 crc kubenswrapper[4834]: E0130 21:16:24.586279 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.619616 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T
21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"
192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.642531 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4j4\" (UniqueName: \"kubernetes.io/projected/f8a589ab-0e20-4c47-a923-363b3be97b20-kube-api-access-zq4j4\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.642628 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.643674 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz
5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.662421 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc 
kubenswrapper[4834]: I0130 21:16:24.685817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.685895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.685921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.685949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.685967 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.692025 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.718300 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:20Z\\\",\\\"message\\\":\\\"r removal\\\\nI0130 21:16:20.235707 6070 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 21:16:20.235716 6070 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 21:16:20.235753 6070 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0130 21:16:20.235757 6070 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0130 21:16:20.235760 6070 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:20.235798 6070 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 21:16:20.235769 6070 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0130 21:16:20.235766 6070 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 21:16:20.235831 6070 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0130 21:16:20.235856 6070 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 21:16:20.235904 6070 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:20.236037 6070 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:16:20.236202 6070 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:16:20.236248 6070 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 21:16:20.236362 6070 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:16:20.236363 6070 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"2 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:22.048909 6232 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.048950 6232 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:16:22.049165 6232 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049192 
6232 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049230 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049385 6232 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049727 6232 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050109 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050376 6232 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.735907 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.743638 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.743782 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4j4\" (UniqueName: \"kubernetes.io/projected/f8a589ab-0e20-4c47-a923-363b3be97b20-kube-api-access-zq4j4\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:24 crc kubenswrapper[4834]: E0130 21:16:24.743904 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:24 crc kubenswrapper[4834]: E0130 21:16:24.744018 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs podName:f8a589ab-0e20-4c47-a923-363b3be97b20 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:25.243986075 +0000 UTC m=+36.397132253 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs") pod "network-metrics-daemon-j5pcw" (UID: "f8a589ab-0e20-4c47-a923-363b3be97b20") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.757987 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.769714 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4j4\" (UniqueName: \"kubernetes.io/projected/f8a589ab-0e20-4c47-a923-363b3be97b20-kube-api-access-zq4j4\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.777361 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.792035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.792104 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.792125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc 
kubenswrapper[4834]: I0130 21:16:24.792157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.792182 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.797935 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.818148 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.842171 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.858117 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.875254 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.895424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.895507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.895564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.895591 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.895611 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.896686 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z 
is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.915085 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.936720 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.958226 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:24Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.998962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.999007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.999021 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.999039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:24 crc kubenswrapper[4834]: I0130 21:16:24.999053 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:24Z","lastTransitionTime":"2026-01-30T21:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.102559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.102646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.102663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.102721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.102740 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.206106 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.206170 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.206221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.206253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.206274 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.250659 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.250937 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.251046 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs podName:f8a589ab-0e20-4c47-a923-363b3be97b20 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:26.251017422 +0000 UTC m=+37.404163590 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs") pod "network-metrics-daemon-j5pcw" (UID: "f8a589ab-0e20-4c47-a923-363b3be97b20") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.310420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.310477 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.310499 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.310525 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.310547 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.351470 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.351684 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.351773 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:16:41.351715613 +0000 UTC m=+52.504861801 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.351858 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.351893 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.351942 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352073 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:41.352034092 +0000 UTC m=+52.505180260 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.352143 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352089 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352214 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352289 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352315 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:41.352274999 +0000 UTC m=+52.505421177 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352319 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352365 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352430 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352455 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352486 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:41.352464414 +0000 UTC m=+52.505610772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.352540 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:41.352512066 +0000 UTC m=+52.505658394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.413814 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.413868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.413877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.413899 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.413913 4834 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.493011 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 04:13:09.67605853 +0000 UTC Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.495116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.495164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.495175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.495199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.495212 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.511312 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:25Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.517282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.517322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.517331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.517348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.517368 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.530087 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.530171 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.530280 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.530303 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.530446 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.530631 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.538659 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:25Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.545012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.545055 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.545070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.545093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.545116 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.566505 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:25Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.572072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.572150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.572171 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.572200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.572220 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.593872 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:25Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.599640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.599700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.599715 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.599739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.599753 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.625689 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:25Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:25 crc kubenswrapper[4834]: E0130 21:16:25.625853 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.628116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.628181 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.628200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.628227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.628248 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.731718 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.731791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.731817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.731853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.731871 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.835777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.835851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.835866 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.835885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.835898 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.939199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.939286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.939311 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.939347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:25 crc kubenswrapper[4834]: I0130 21:16:25.939372 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:25Z","lastTransitionTime":"2026-01-30T21:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.043175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.043264 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.043293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.043332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.043362 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.147369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.147514 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.147546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.147579 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.147605 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.250983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.251061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.251086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.251117 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.251139 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.264438 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:26 crc kubenswrapper[4834]: E0130 21:16:26.264614 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:26 crc kubenswrapper[4834]: E0130 21:16:26.264697 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs podName:f8a589ab-0e20-4c47-a923-363b3be97b20 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:28.264672754 +0000 UTC m=+39.417818922 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs") pod "network-metrics-daemon-j5pcw" (UID: "f8a589ab-0e20-4c47-a923-363b3be97b20") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.354068 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.354128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.354145 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.354171 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.354190 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.457472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.457572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.457595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.457621 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.457639 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.493596 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 01:40:50.883888948 +0000 UTC Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.530035 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:26 crc kubenswrapper[4834]: E0130 21:16:26.530273 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.561022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.561076 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.561094 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.561123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.561142 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.664063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.664129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.664146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.664170 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.664195 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.768170 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.768258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.768276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.768300 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.768318 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.871720 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.871793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.871816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.871846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.871869 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.975007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.975056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.975073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.975099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:26 crc kubenswrapper[4834]: I0130 21:16:26.975116 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:26Z","lastTransitionTime":"2026-01-30T21:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.077775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.077827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.077844 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.077871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.077889 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.181530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.181617 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.181642 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.181680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.181704 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.284659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.284993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.285173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.285311 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.285496 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.297670 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.318391 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.335981 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.357330 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.379522 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.389019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.389056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.389073 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.389099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.389117 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.404223 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.423737 4834 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.456893 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f5039
3bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.478116 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.492188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.492245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.492270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.492307 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.492330 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.494464 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 00:58:03.604656857 +0000 UTC Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.496865 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc 
kubenswrapper[4834]: I0130 21:16:27.514856 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.530820 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:27 crc kubenswrapper[4834]: E0130 21:16:27.530969 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.531385 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:27 crc kubenswrapper[4834]: E0130 21:16:27.531551 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.531651 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:27 crc kubenswrapper[4834]: E0130 21:16:27.531769 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.546047 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.577816 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:20Z\\\",\\\"message\\\":\\\"r removal\\\\nI0130 21:16:20.235707 6070 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 21:16:20.235716 6070 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 21:16:20.235753 6070 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0130 21:16:20.235757 6070 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0130 21:16:20.235760 6070 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:20.235798 6070 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 21:16:20.235769 6070 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0130 21:16:20.235766 6070 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 21:16:20.235831 6070 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0130 21:16:20.235856 6070 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 21:16:20.235904 6070 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:20.236037 6070 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:16:20.236202 6070 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:16:20.236248 6070 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 21:16:20.236362 6070 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:16:20.236363 6070 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"2 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:22.048909 6232 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.048950 6232 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:16:22.049165 6232 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049192 
6232 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049230 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049385 6232 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049727 6232 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050109 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050376 6232 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.594731 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.595883 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.595943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.595961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.595987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.596005 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.618222 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.642824 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.663360 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.681467 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:27Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.698627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.698700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.698728 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.698764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.698790 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.802458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.802512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.802528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.802549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.802564 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.906760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.907228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.907388 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.907590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:27 crc kubenswrapper[4834]: I0130 21:16:27.907890 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:27Z","lastTransitionTime":"2026-01-30T21:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.011764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.011825 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.011842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.011879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.011897 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.114773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.114818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.114831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.114849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.114863 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.217694 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.217767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.217787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.217812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.217831 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.288850 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:28 crc kubenswrapper[4834]: E0130 21:16:28.289048 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:28 crc kubenswrapper[4834]: E0130 21:16:28.289134 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs podName:f8a589ab-0e20-4c47-a923-363b3be97b20 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:32.289109065 +0000 UTC m=+43.442255233 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs") pod "network-metrics-daemon-j5pcw" (UID: "f8a589ab-0e20-4c47-a923-363b3be97b20") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.321184 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.321247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.321263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.321287 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.321305 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.424478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.424546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.424563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.424588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.424605 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.495380 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 18:39:33.134736048 +0000 UTC Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.527888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.527955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.527974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.528001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.528021 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.530219 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:28 crc kubenswrapper[4834]: E0130 21:16:28.530412 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.631472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.631541 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.631566 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.631596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.631619 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.734960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.735017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.735035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.735058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.735075 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.837924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.837990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.838014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.838044 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.838068 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.940838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.940908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.940927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.940953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:28 crc kubenswrapper[4834]: I0130 21:16:28.940972 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:28Z","lastTransitionTime":"2026-01-30T21:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.043276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.043321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.043338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.043361 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.043378 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.148337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.148705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.148904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.149053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.149176 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.252839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.252889 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.252907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.252933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.252953 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.355638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.355703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.355720 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.355749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.355770 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.459072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.459476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.459654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.459786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.459972 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.497074 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 05:41:06.494716074 +0000 UTC Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.530364 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.531599 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.531597 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:29 crc kubenswrapper[4834]: E0130 21:16:29.532477 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:29 crc kubenswrapper[4834]: E0130 21:16:29.535350 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:29 crc kubenswrapper[4834]: E0130 21:16:29.535635 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.557874 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.564485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.564717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.564933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.565138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.565332 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.579222 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.601428 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.622804 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.648084 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.669151 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.669213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.669233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.669257 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.669275 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.677647 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23eec25a4de5b36f922c5d5d1c0c756b1801e9a3d38bad09f0d57fc9bd3c618f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:20Z\\\",\\\"message\\\":\\\"r removal\\\\nI0130 21:16:20.235707 6070 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0130 21:16:20.235716 6070 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0130 21:16:20.235753 6070 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0130 21:16:20.235757 6070 handler.go:208] Removed 
*v1.NetworkPolicy event handler 4\\\\nI0130 21:16:20.235760 6070 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:20.235798 6070 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0130 21:16:20.235769 6070 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0130 21:16:20.235766 6070 handler.go:208] Removed *v1.Node event handler 2\\\\nI0130 21:16:20.235831 6070 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0130 21:16:20.235856 6070 handler.go:208] Removed *v1.Node event handler 7\\\\nI0130 21:16:20.235904 6070 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:20.236037 6070 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0130 21:16:20.236202 6070 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0130 21:16:20.236248 6070 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0130 21:16:20.236362 6070 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0130 21:16:20.236363 6070 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"2 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:22.048909 6232 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.048950 6232 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:16:22.049165 6232 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049192 
6232 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049230 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049385 6232 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049727 6232 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050109 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050376 6232 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.690762 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.704941 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e591e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.718603 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.731765 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.745487 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.760278 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.772598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.772649 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.772669 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.772693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.772710 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.776095 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.790501 4834 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.844253 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f5039
3bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.857822 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.873057 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:29Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:29 crc 
kubenswrapper[4834]: I0130 21:16:29.874638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.874696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.874708 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.874725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.874737 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.977484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.977572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.977620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.977645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:29 crc kubenswrapper[4834]: I0130 21:16:29.977664 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:29Z","lastTransitionTime":"2026-01-30T21:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.081238 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.081306 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.081330 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.081378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.081436 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.184130 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.184220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.184246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.184279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.184305 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.286722 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.286801 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.286821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.286846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.286864 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.389544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.389605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.389622 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.389646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.389667 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.493474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.493517 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.493528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.493544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.493555 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.498868 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 10:47:43.19181533 +0000 UTC Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.530472 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:30 crc kubenswrapper[4834]: E0130 21:16:30.530655 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.596538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.596608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.596633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.596665 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.596690 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.700142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.700308 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.700339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.700440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.700521 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.803796 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.803871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.803898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.803927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.803949 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.906851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.906929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.906948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.906972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:30 crc kubenswrapper[4834]: I0130 21:16:30.906991 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:30Z","lastTransitionTime":"2026-01-30T21:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.010348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.010469 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.010494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.010521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.010540 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.114325 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.114371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.114384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.114424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.114441 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.217900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.217958 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.217976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.218001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.218018 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.323896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.323999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.324023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.324053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.324075 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.427119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.427176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.427195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.427221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.427240 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.499776 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 13:44:39.891801118 +0000 UTC Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.530068 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.530105 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.530121 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:31 crc kubenswrapper[4834]: E0130 21:16:31.530237 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:31 crc kubenswrapper[4834]: E0130 21:16:31.530379 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.530522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.530550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.530568 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.530590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: E0130 21:16:31.530579 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.530608 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.634298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.634378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.634428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.634460 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.634486 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.736989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.737046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.737064 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.737089 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.737109 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.839868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.839926 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.839943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.839967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.839989 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.943189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.943321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.943341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.943365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:31 crc kubenswrapper[4834]: I0130 21:16:31.943381 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:31Z","lastTransitionTime":"2026-01-30T21:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.045761 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.045826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.045847 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.045880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.045906 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.148271 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.148319 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.148329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.148344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.148362 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.251741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.251815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.251838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.251872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.251895 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.339625 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:32 crc kubenswrapper[4834]: E0130 21:16:32.339863 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:32 crc kubenswrapper[4834]: E0130 21:16:32.339952 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs podName:f8a589ab-0e20-4c47-a923-363b3be97b20 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:40.339927584 +0000 UTC m=+51.493073762 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs") pod "network-metrics-daemon-j5pcw" (UID: "f8a589ab-0e20-4c47-a923-363b3be97b20") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.355014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.355075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.355092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.355114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.355134 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.458011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.458074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.458092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.458116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.458138 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.500802 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 11:02:30.713189722 +0000 UTC Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.530245 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:32 crc kubenswrapper[4834]: E0130 21:16:32.530451 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.561040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.561098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.561119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.561144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.561162 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.663805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.663846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.663858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.663874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.663886 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.767255 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.767318 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.767336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.767362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.767380 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.870389 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.870497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.870515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.870540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.870560 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.973783 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.973848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.973867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.973892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:32 crc kubenswrapper[4834]: I0130 21:16:32.973911 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:32Z","lastTransitionTime":"2026-01-30T21:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.076989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.077021 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.077030 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.077044 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.077053 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.180507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.180566 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.180583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.180608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.180625 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.283885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.283972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.284038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.284066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.284084 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.387596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.387668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.387692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.387725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.387751 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.490996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.491060 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.491079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.491106 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.491127 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.501215 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 16:10:28.416500656 +0000 UTC Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.530811 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.530853 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.530993 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:33 crc kubenswrapper[4834]: E0130 21:16:33.531145 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:33 crc kubenswrapper[4834]: E0130 21:16:33.531315 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:33 crc kubenswrapper[4834]: E0130 21:16:33.531487 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.594825 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.594887 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.594905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.594947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.594965 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.698358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.698630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.698663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.698700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.698722 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.801594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.801660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.801683 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.801712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.801735 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.904704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.904767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.904785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.904813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:33 crc kubenswrapper[4834]: I0130 21:16:33.904834 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:33Z","lastTransitionTime":"2026-01-30T21:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.008468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.008545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.008570 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.008602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.008621 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.114983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.115046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.115066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.115092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.115114 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.218663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.218789 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.218810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.218836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.218853 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.322097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.322186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.322204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.322259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.322281 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.426052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.426118 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.426135 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.426160 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.426178 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.502183 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 07:36:00.744521165 +0000 UTC Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.528876 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.528928 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.528944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.528967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.528985 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.530385 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:34 crc kubenswrapper[4834]: E0130 21:16:34.530591 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.633002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.633065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.633087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.633114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.633134 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.736317 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.736358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.736369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.736387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.736424 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.839039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.839104 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.839124 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.839149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.839165 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.942322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.942386 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.942438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.942472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:34 crc kubenswrapper[4834]: I0130 21:16:34.942493 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:34Z","lastTransitionTime":"2026-01-30T21:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.046065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.046129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.046149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.046173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.046192 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.148571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.148631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.148648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.148672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.148718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.252056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.252115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.252134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.252158 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.252178 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.355358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.355457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.355475 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.355499 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.355518 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.458667 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.458727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.458748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.458776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.458795 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.502682 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 07:25:24.760010997 +0000 UTC Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.530241 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.530331 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.530445 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.530646 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.530798 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.531000 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.561646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.561704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.561721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.561744 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.561761 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.664687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.664764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.664783 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.664812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.664832 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.768218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.768279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.768298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.768321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.768341 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.817495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.817563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.817581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.817607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.817624 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.838149 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:35Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.843258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.843328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.843352 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.843381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.843441 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.865085 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:35Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.874608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.874672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.874691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.874717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.874736 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.895283 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:35Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.900312 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.900366 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.900387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.900439 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.900457 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.920817 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:35Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.925547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.925596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.925613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.925636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.925652 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.947454 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:35Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:35 crc kubenswrapper[4834]: E0130 21:16:35.947691 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.949946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.950008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.950025 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.950053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:35 crc kubenswrapper[4834]: I0130 21:16:35.950071 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:35Z","lastTransitionTime":"2026-01-30T21:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.052691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.052750 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.052768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.052797 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.052818 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.155904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.155970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.155988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.156013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.156032 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.259220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.259266 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.259281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.259300 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.259315 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.362282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.362325 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.362341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.362364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.362384 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.464903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.464979 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.465005 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.465037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.465059 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.484385 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.485664 4834 scope.go:117] "RemoveContainer" containerID="c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.503835 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 09:30:48.116346969 +0000 UTC Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.507368 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.526006 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.530097 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:36 crc kubenswrapper[4834]: E0130 21:16:36.530325 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.549843 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.568430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.568562 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.568582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.568612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.568630 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.575747 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated 
for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.600883 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.620487 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.653575 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.671851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.671913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.671932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.671957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.671976 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.681531 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5
c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.703303 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc 
kubenswrapper[4834]: I0130 21:16:36.725217 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba
77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.762595 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"2 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:22.048909 6232 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.048950 6232 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:16:22.049165 6232 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049192 6232 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049230 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049385 6232 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049727 6232 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050109 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050376 6232 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.774275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.774301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.774309 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.774322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.774332 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.779891 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.799306 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.817364 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.836155 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.853186 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.867733 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:36Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.877303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.877423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.877447 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.877472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.877491 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.980548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.980607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.980627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.980656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.980678 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:36Z","lastTransitionTime":"2026-01-30T21:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:36 crc kubenswrapper[4834]: I0130 21:16:36.998351 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/1.log" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.002269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.003104 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.043149 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d
71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.069714 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.081754 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc 
kubenswrapper[4834]: I0130 21:16:37.083786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.083836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.083854 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.083878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.083896 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.106020 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"2 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:22.048909 6232 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.048950 6232 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:16:22.049165 6232 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049192 6232 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049230 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049385 6232 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049727 6232 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050109 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050376 6232 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.119675 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.135682 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.149272 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.166731 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.181291 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.186688 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.186733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.186743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc 
kubenswrapper[4834]: I0130 21:16:37.186756 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.186765 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.207844 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
9556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.221881 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.237204 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.253180 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.266910 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.285885 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.289939 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.289998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.290019 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.290047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.290065 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.300613 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.316989 4834 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:37Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.393494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.393595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.393612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.393635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.393651 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.497526 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.497630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.497654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.497686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.497710 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.504748 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 23:56:07.326871923 +0000 UTC Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.530145 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.530187 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.530256 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:37 crc kubenswrapper[4834]: E0130 21:16:37.530295 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:37 crc kubenswrapper[4834]: E0130 21:16:37.530486 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:37 crc kubenswrapper[4834]: E0130 21:16:37.530596 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.600588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.600818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.600826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.600839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.600848 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.703837 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.703899 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.703916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.703942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.703962 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.806728 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.806803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.806824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.806851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.806870 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.909564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.909631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.909650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.909675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:37 crc kubenswrapper[4834]: I0130 21:16:37.909772 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:37Z","lastTransitionTime":"2026-01-30T21:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.008354 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/2.log" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.009198 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/1.log" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.012007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.012072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.012085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.012109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.012124 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.013848 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e" exitCode=1 Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.013903 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.013952 4834 scope.go:117] "RemoveContainer" containerID="c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.015060 4834 scope.go:117] "RemoveContainer" containerID="e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e" Jan 30 21:16:38 crc kubenswrapper[4834]: E0130 21:16:38.015303 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.054480 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.076776 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.095281 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc 
kubenswrapper[4834]: I0130 21:16:38.111587 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.116854 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.116931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.116950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.116978 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.116991 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.133476 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.153670 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.172852 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.191896 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.216970 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.221497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.221693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.221769 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.221826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.221856 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.250210 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c81e1458ca71bde88196c6fd4271d710c147205fc695a3eb6ff61ddcf6b6ea7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"message\\\":\\\"2 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0130 21:16:22.048909 6232 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.048950 6232 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0130 21:16:22.049165 6232 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049192 6232 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049230 6232 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049385 6232 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:22.049727 6232 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050109 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:22.050376 6232 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPat
h\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.268506 4834 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e591e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.290722 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.306910 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.324873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.324938 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.324963 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.325183 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.325210 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.328639 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z 
is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.349369 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for 
*v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.370419 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.390659 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:38Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.429698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.429762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.429782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.429812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.429834 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.505979 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 10:43:28.829459001 +0000 UTC Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.530766 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:38 crc kubenswrapper[4834]: E0130 21:16:38.530955 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.533540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.533610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.533632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.533660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.533680 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.637263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.637355 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.637383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.637466 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.637494 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.740965 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.741033 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.741051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.741077 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.741095 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.844631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.844704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.844722 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.844746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.844763 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.948312 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.948381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.948425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.948453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:38 crc kubenswrapper[4834]: I0130 21:16:38.948472 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:38Z","lastTransitionTime":"2026-01-30T21:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.022436 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/2.log" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.029105 4834 scope.go:117] "RemoveContainer" containerID="e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e" Jan 30 21:16:39 crc kubenswrapper[4834]: E0130 21:16:39.029389 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.054040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.054142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.054163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.054189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.054207 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.054847 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-ope
rator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 
21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.080254 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.100287 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.139189 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.158358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.158444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.158462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.158490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.158510 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.160895 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5
c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.180026 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc 
kubenswrapper[4834]: I0130 21:16:39.203904 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.226305 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.247224 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.263465 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.263556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.263573 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.263632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.263653 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.266797 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0
554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.291490 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec
1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:
18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.325027 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) 
from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.344703 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.365881 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e591e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.366896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.367000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.367025 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.367055 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.367080 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.389709 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.409857 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.431706 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.469806 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.469854 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.469868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.469886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.469898 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.506491 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:44:50.071107858 +0000 UTC Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.530781 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.530846 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:39 crc kubenswrapper[4834]: E0130 21:16:39.530924 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.530968 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:39 crc kubenswrapper[4834]: E0130 21:16:39.531129 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:39 crc kubenswrapper[4834]: E0130 21:16:39.531240 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.565451 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.575053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.575111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.575129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.575156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.575175 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.587807 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.616583 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.635651 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.657204 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.679320 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.679375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.679223 4834 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.679421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.679643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.679672 
4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.706277 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec
1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:
18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.723069 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.743868 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.758555 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.781604 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.783533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.783583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.783601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.783623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.783640 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.801593 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated 
for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.820475 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.837865 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.873487 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.886127 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.886199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.886218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.886244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.886262 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.893278 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5
c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.911480 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:39Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:39 crc 
kubenswrapper[4834]: I0130 21:16:39.989605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.989665 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.989684 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.989708 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:39 crc kubenswrapper[4834]: I0130 21:16:39.989728 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:39Z","lastTransitionTime":"2026-01-30T21:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.093177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.093234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.093251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.093275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.093296 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.196954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.197037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.197057 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.197091 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.197112 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.300234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.300313 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.300340 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.300372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.300492 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.403871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.403924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.403943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.403970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.403993 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.434988 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:40 crc kubenswrapper[4834]: E0130 21:16:40.435242 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:40 crc kubenswrapper[4834]: E0130 21:16:40.435341 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs podName:f8a589ab-0e20-4c47-a923-363b3be97b20 nodeName:}" failed. No retries permitted until 2026-01-30 21:16:56.435312366 +0000 UTC m=+67.588458544 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs") pod "network-metrics-daemon-j5pcw" (UID: "f8a589ab-0e20-4c47-a923-363b3be97b20") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.506743 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 05:43:56.593176398 +0000 UTC Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.507451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.507547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.507559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.507580 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.507600 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.530110 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:40 crc kubenswrapper[4834]: E0130 21:16:40.530311 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.611097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.611161 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.611179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.611204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.611224 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.720885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.721617 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.721840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.722017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.722143 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.826513 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.826614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.826643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.826680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.826705 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.930475 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.930548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.930571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.930609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:40 crc kubenswrapper[4834]: I0130 21:16:40.930635 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:40Z","lastTransitionTime":"2026-01-30T21:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.035294 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.035380 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.035425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.035454 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.035475 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.139698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.139769 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.139788 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.139816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.139836 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.243737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.243789 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.243807 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.243833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.243851 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.347325 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.347769 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.347988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.348153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.348300 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.446239 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.446562 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 21:17:13.446517968 +0000 UTC m=+84.599664146 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.446935 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.447237 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.447466 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.447668 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.447274 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.448104 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.448282 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.448535 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:17:13.448499704 +0000 UTC m=+84.601645882 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.447428 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.447733 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.447857 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.449794 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:17:13.449759869 +0000 UTC m=+84.602906107 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.450272 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:17:13.450249773 +0000 UTC m=+84.603395941 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.450068 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.450579 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.450792 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-30 21:17:13.450760968 +0000 UTC m=+84.603907206 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.451593 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.451652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.451671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.451699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.451717 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.507140 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:28:46.129684697 +0000 UTC Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.530644 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.530834 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.530919 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.531139 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.531469 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:41 crc kubenswrapper[4834]: E0130 21:16:41.531839 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.554323 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.554381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.554426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.554449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.554467 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.657882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.657932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.657949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.657981 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.658005 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.760925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.760999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.761016 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.761042 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.761062 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.865155 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.865230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.865249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.865274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.865292 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.969085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.969160 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.969179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.969207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:41 crc kubenswrapper[4834]: I0130 21:16:41.969227 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:41Z","lastTransitionTime":"2026-01-30T21:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.072982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.073029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.073038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.073052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.073063 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.175918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.176006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.176022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.176045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.176510 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.281461 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.281534 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.281552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.281581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.281600 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.385154 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.385494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.385592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.385680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.386136 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.489426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.489502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.489521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.489548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.489568 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.508021 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 02:34:45.726145611 +0000 UTC Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.530380 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:42 crc kubenswrapper[4834]: E0130 21:16:42.530804 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.592565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.593086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.593215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.593354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.593504 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.697066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.697486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.697680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.697832 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.697980 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.801221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.801268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.801280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.801300 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.801314 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.905421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.905540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.905563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.905601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.905651 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:42Z","lastTransitionTime":"2026-01-30T21:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.941272 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.956889 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.978662 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9
a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:42Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:42 crc kubenswrapper[4834]: I0130 21:16:42.999791 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:42Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.010279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.010324 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.010341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.010366 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.010385 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.017223 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc 
kubenswrapper[4834]: I0130 21:16:43.034511 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.060601 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.093002 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.109811 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.113164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.113211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.113230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.113254 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.113272 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.132277 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.151895 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.172324 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.191917 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.215585 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.217146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.217230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.217251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.217285 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.217307 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.233083 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.253902 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed21
69d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.275826 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.297296 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.316518 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:43Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.320834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.320890 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.320909 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.320933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.320952 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.425867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.425957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.425981 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.426007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.426027 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.508248 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 11:19:12.49676353 +0000 UTC Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.529246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.529315 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.529334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.529365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.529385 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.530156 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.530197 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.530208 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:43 crc kubenswrapper[4834]: E0130 21:16:43.530307 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:43 crc kubenswrapper[4834]: E0130 21:16:43.530514 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:43 crc kubenswrapper[4834]: E0130 21:16:43.530611 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.632791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.632856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.632876 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.632901 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.632918 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.736695 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.736784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.736809 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.736841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.736864 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.840769 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.840845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.840868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.840901 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.840923 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.944299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.944481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.944512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.944548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:43 crc kubenswrapper[4834]: I0130 21:16:43.944581 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:43Z","lastTransitionTime":"2026-01-30T21:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.048785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.048842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.048860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.048884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.048902 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.151799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.151867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.151891 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.151921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.151946 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.255148 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.255209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.255227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.255252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.255271 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.358976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.359032 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.359050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.359074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.359091 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.462684 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.462733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.462749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.462774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.462793 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.508965 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 03:33:45.184095834 +0000 UTC Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.530354 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:44 crc kubenswrapper[4834]: E0130 21:16:44.530594 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.565551 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.565605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.565623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.565647 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.565667 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.668741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.668810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.668829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.668857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.668876 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.772301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.772624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.772803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.773019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.773232 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.876181 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.876281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.876307 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.876342 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.876366 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.979914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.980054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.980078 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.980109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:44 crc kubenswrapper[4834]: I0130 21:16:44.980131 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:44Z","lastTransitionTime":"2026-01-30T21:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.083532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.083640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.083662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.083687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.083705 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.186350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.186454 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.186474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.186495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.186514 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.289141 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.289176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.289186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.289201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.289212 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.393153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.393215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.393232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.393257 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.393278 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.496040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.496114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.496132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.496157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.496175 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.509740 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 06:33:25.414167157 +0000 UTC Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.530162 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.530217 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:45 crc kubenswrapper[4834]: E0130 21:16:45.530334 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:45 crc kubenswrapper[4834]: E0130 21:16:45.530468 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.530502 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:45 crc kubenswrapper[4834]: E0130 21:16:45.530669 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.599853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.599912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.600161 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.600199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.600217 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.703549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.703608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.703625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.703654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.703677 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.806900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.806944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.806956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.806977 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.806990 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.909171 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.909223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.909243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.909266 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:45 crc kubenswrapper[4834]: I0130 21:16:45.909285 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:45Z","lastTransitionTime":"2026-01-30T21:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.012775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.013107 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.013244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.013383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.013548 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.117057 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.117122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.117140 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.117164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.117185 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.220339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.220440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.220470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.220499 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.220520 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.253320 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.253454 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.253472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.253493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.253514 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: E0130 21:16:46.274982 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.280183 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.280232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.280249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.280269 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.280285 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: E0130 21:16:46.301053 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.306213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.306278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.306378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.306444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.306520 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: E0130 21:16:46.325877 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.334024 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.334070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.334088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.334110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.334129 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: E0130 21:16:46.355558 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.360127 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.360170 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.360186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.360211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.360229 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: E0130 21:16:46.380195 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:46Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:46 crc kubenswrapper[4834]: E0130 21:16:46.380539 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.382709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.382763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.382781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.382805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.382823 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.486497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.486542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.486558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.486582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.486599 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.510541 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:46:53.545153151 +0000 UTC Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.529921 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:46 crc kubenswrapper[4834]: E0130 21:16:46.530102 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.590013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.590073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.590092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.590118 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.590137 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.693013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.693079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.693095 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.693119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.693141 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.796443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.796505 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.796523 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.796545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.796564 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.899455 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.899534 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.899552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.899583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:46 crc kubenswrapper[4834]: I0130 21:16:46.899603 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:46Z","lastTransitionTime":"2026-01-30T21:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.003706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.003780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.003804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.003835 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.003857 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.106747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.106796 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.106848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.106875 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.106894 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.209602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.209666 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.209682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.209705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.209724 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.313311 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.313377 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.313423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.313448 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.313465 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.415790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.415821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.415830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.415844 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.415856 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.510755 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 18:46:09.766505298 +0000 UTC Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.517529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.517584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.517603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.517625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.517643 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.530942 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.530982 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.531067 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:47 crc kubenswrapper[4834]: E0130 21:16:47.531188 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:47 crc kubenswrapper[4834]: E0130 21:16:47.531460 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:47 crc kubenswrapper[4834]: E0130 21:16:47.531738 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.620374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.620479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.620497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.620526 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.620547 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.723113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.723157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.723168 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.723188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.723200 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.826669 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.826750 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.826780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.826813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.826837 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.930113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.930156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.930166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.930183 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:47 crc kubenswrapper[4834]: I0130 21:16:47.930195 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:47Z","lastTransitionTime":"2026-01-30T21:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.032915 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.032970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.033008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.033047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.033072 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.135776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.135838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.135856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.135882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.135904 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.239645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.239706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.239723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.239750 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.239770 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.342862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.342995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.343013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.343037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.343054 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.446472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.446500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.446508 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.446521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.446530 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.511462 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 18:26:48.079459111 +0000 UTC Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.530009 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:48 crc kubenswrapper[4834]: E0130 21:16:48.530265 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.549706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.549757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.549774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.549796 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.549811 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.652984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.653024 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.653033 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.653048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.653059 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.756150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.756220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.756231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.756252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.756263 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.859604 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.859934 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.860093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.860253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.860430 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.964284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.964626 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.964731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.964994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:48 crc kubenswrapper[4834]: I0130 21:16:48.965094 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:48Z","lastTransitionTime":"2026-01-30T21:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.067521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.067610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.067637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.067668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.067686 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.171338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.171447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.171472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.171503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.171526 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.275213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.275284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.275316 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.275345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.275363 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.378786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.378870 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.378896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.378925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.378949 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.480731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.480774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.480786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.480803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.480815 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.512280 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 11:15:15.598140834 +0000 UTC Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.530004 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.530035 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.530117 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:49 crc kubenswrapper[4834]: E0130 21:16:49.530306 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:49 crc kubenswrapper[4834]: E0130 21:16:49.530522 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:49 crc kubenswrapper[4834]: E0130 21:16:49.530670 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.551501 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.566878 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.583479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.583524 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.583540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.583564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.583582 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.589048 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z 
is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.611925 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for 
*v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.634365 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.654269 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.669094 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc 
kubenswrapper[4834]: I0130 21:16:49.686813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.686870 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.686891 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.686917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.686935 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.693344 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.707347 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.721209 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.736887 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.753007 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.773713 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.790112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.790174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.790200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.790232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.790259 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.791325 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.804758 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.819429 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.836023 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.852676 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:49Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.893173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.893214 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.893223 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.893239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.893251 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.996132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.996175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.996184 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.996200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:49 crc kubenswrapper[4834]: I0130 21:16:49.996211 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:49Z","lastTransitionTime":"2026-01-30T21:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.098992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.099060 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.099079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.099104 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.099123 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.201722 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.201784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.201799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.201823 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.201839 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.304560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.304612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.304629 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.304651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.304668 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.407988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.408036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.408053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.408075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.408092 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.510471 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.510521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.510539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.510562 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.510579 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.613305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.613353 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.613370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.613424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.613450 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.715793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.715901 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.715928 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.715955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.715976 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.785788 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:50 crc kubenswrapper[4834]: E0130 21:16:50.785975 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.786314 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.786576 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:50 crc kubenswrapper[4834]: E0130 21:16:50.786690 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:50 crc kubenswrapper[4834]: E0130 21:16:50.786564 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.786881 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 02:15:15.692460801 +0000 UTC Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.787631 4834 scope.go:117] "RemoveContainer" containerID="e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e" Jan 30 21:16:50 crc kubenswrapper[4834]: E0130 21:16:50.787974 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.819872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.819918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.819934 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.819957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.819976 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.924190 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.924955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.925606 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.925671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:50 crc kubenswrapper[4834]: I0130 21:16:50.925696 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:50Z","lastTransitionTime":"2026-01-30T21:16:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.030312 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.030376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.030423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.030450 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.030468 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.133026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.133106 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.133126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.133157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.133238 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.236945 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.237007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.237023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.237048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.237070 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.339882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.339969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.339993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.340022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.340082 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.442985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.443020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.443028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.443041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.443050 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.530488 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:51 crc kubenswrapper[4834]: E0130 21:16:51.530758 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.544846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.544911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.544928 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.544951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.544970 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.648391 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.648512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.648530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.648557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.648579 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.751572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.751636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.751652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.751678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.751695 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.787208 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 10:26:13.36879402 +0000 UTC Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.855770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.855831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.855851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.855879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.855899 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.959114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.959179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.959196 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.959220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:51 crc kubenswrapper[4834]: I0130 21:16:51.959238 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:51Z","lastTransitionTime":"2026-01-30T21:16:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.062473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.062602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.062627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.062655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.062672 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.167270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.167415 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.167435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.167461 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.167480 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.270639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.270700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.270719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.270747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.270767 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.374637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.374708 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.374727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.374755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.374777 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.478701 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.478780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.478814 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.478848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.478868 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.530528 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.530633 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:52 crc kubenswrapper[4834]: E0130 21:16:52.530708 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.530957 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:52 crc kubenswrapper[4834]: E0130 21:16:52.531231 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:52 crc kubenswrapper[4834]: E0130 21:16:52.531506 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.582114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.582178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.582201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.582233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.582252 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.686458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.686520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.686538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.686569 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.686592 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.787378 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 03:22:47.718915146 +0000 UTC Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.789935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.789980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.790041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.790151 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.790170 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.893420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.893476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.893493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.893518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.893537 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.996357 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.996701 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.996721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.996748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:52 crc kubenswrapper[4834]: I0130 21:16:52.996767 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:52Z","lastTransitionTime":"2026-01-30T21:16:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.100002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.100087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.100110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.100140 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.100164 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.203000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.203045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.203063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.203085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.203102 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.306341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.306413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.306428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.306469 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.306486 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.409413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.409456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.409465 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.409479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.409488 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.513004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.513069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.513085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.513103 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.513138 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.531096 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:53 crc kubenswrapper[4834]: E0130 21:16:53.531328 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.616693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.616754 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.616772 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.616798 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.616816 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.719914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.719967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.719984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.720006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.720024 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.787634 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 12:50:50.434267205 +0000 UTC Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.823567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.823604 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.823612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.823653 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.823664 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.926895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.926944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.926961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.926986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:53 crc kubenswrapper[4834]: I0130 21:16:53.927004 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:53Z","lastTransitionTime":"2026-01-30T21:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.030479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.030550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.030566 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.030584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.030597 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.133297 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.133352 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.133368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.133417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.133436 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.236153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.236215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.236230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.236248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.236263 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.338990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.339052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.339074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.339104 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.339124 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.441535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.441582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.441595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.441611 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.441624 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.530888 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.530982 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:54 crc kubenswrapper[4834]: E0130 21:16:54.531071 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.530981 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:54 crc kubenswrapper[4834]: E0130 21:16:54.531155 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:54 crc kubenswrapper[4834]: E0130 21:16:54.531242 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.545131 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.545187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.545205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.545230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.545251 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.647853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.647929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.647951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.647985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.648008 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.749877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.749907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.749915 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.749927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.749936 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.788623 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 17:56:31.820091695 +0000 UTC Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.852664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.852713 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.852731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.852758 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.852776 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.956634 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.956691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.956717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.956747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:54 crc kubenswrapper[4834]: I0130 21:16:54.956771 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:54Z","lastTransitionTime":"2026-01-30T21:16:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.059369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.059430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.059440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.059457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.059467 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.162369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.162421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.162432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.162447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.162459 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.265453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.265492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.265501 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.265540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.265553 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.368153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.368235 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.368254 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.368280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.368298 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.470424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.470453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.470462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.470477 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.470486 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.530439 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:55 crc kubenswrapper[4834]: E0130 21:16:55.530620 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.572790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.572873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.572893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.572920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.572940 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.675121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.675176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.675195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.675219 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.675236 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.777655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.777702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.777718 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.777738 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.777750 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.789342 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:59:03.799544546 +0000 UTC Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.880207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.880265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.880291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.880309 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.880331 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.983432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.983492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.983502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.983522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:55 crc kubenswrapper[4834]: I0130 21:16:55.983533 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:55Z","lastTransitionTime":"2026-01-30T21:16:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.086159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.086218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.086228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.086251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.086263 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.189126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.189259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.189280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.189305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.189323 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.292643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.292726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.292738 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.292759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.292770 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.396334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.396406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.396417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.396438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.396449 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.499289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.499326 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.499342 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.499364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.499379 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.525985 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:56 crc kubenswrapper[4834]: E0130 21:16:56.526089 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:56 crc kubenswrapper[4834]: E0130 21:16:56.526139 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs podName:f8a589ab-0e20-4c47-a923-363b3be97b20 nodeName:}" failed. No retries permitted until 2026-01-30 21:17:28.526124198 +0000 UTC m=+99.679270336 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs") pod "network-metrics-daemon-j5pcw" (UID: "f8a589ab-0e20-4c47-a923-363b3be97b20") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.530466 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.530523 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:56 crc kubenswrapper[4834]: E0130 21:16:56.530556 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:56 crc kubenswrapper[4834]: E0130 21:16:56.530766 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.531125 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:56 crc kubenswrapper[4834]: E0130 21:16:56.531486 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.602639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.602676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.602687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.602731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.602746 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.705987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.706036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.706046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.706062 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.706076 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.721367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.721406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.721416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.721427 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.721436 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: E0130 21:16:56.739050 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.743949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.743977 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.743986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.744002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.744014 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.770298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.770372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.770391 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.770447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.770465 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.790380 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 14:00:38.828115068 +0000 UTC Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.796302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.796350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.796366 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.796419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.796482 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.825518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.825597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.825620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.825649 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.825670 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: E0130 21:16:56.842209 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:56Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:56 crc kubenswrapper[4834]: E0130 21:16:56.842668 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.845592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.845672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.845693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.845723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.845747 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.948641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.948734 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.948752 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.948776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:56 crc kubenswrapper[4834]: I0130 21:16:56.948794 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:56Z","lastTransitionTime":"2026-01-30T21:16:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.052294 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.052347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.052359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.052379 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.052406 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.155841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.155884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.155896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.155911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.155923 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.259546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.259595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.259605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.259624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.259636 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.362708 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.362751 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.362760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.362776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.362787 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.466380 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.466469 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.466497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.466516 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.466528 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.531022 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:57 crc kubenswrapper[4834]: E0130 21:16:57.531263 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.570661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.570865 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.570877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.570895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.570909 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.674123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.674166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.674179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.674198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.674208 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.777447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.777506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.777516 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.777536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.777548 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.790932 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 20:14:39.458806945 +0000 UTC Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.880681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.880737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.880755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.880784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.880801 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.984224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.984267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.984279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.984421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:57 crc kubenswrapper[4834]: I0130 21:16:57.984443 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:57Z","lastTransitionTime":"2026-01-30T21:16:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.087331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.087375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.087385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.087421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.087434 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.103037 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/0.log" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.103087 4834 generic.go:334] "Generic (PLEG): container finished" podID="25f6f1cd-cd4b-475a-85a3-4e81cda5d203" containerID="0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25" exitCode=1 Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.103121 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5655f" event={"ID":"25f6f1cd-cd4b-475a-85a3-4e81cda5d203","Type":"ContainerDied","Data":"0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.103532 4834 scope.go:117] "RemoveContainer" containerID="0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.120419 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:57Z\\\",\\\"message\\\":\\\"2026-01-30T21:16:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e\\\\n2026-01-30T21:16:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e to /host/opt/cni/bin/\\\\n2026-01-30T21:16:12Z [verbose] multus-daemon started\\\\n2026-01-30T21:16:12Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:16:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.140636 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.154489 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.167320 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.187063 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.191230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.191280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.191293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.191315 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.191329 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.204847 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94
c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.218484 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.230537 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc 
kubenswrapper[4834]: I0130 21:16:58.250837 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.269696 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.286367 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.293962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.294000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.294013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc 
kubenswrapper[4834]: I0130 21:16:58.294036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.294050 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.303267 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.317061 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.333382 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.359045 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.370326 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.385720 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.395906 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.395949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.395966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.395988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.396004 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.399452 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e591e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:58Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.499996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.500049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.500070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.500094 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.500111 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.530504 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.530548 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.530564 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:16:58 crc kubenswrapper[4834]: E0130 21:16:58.530612 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:16:58 crc kubenswrapper[4834]: E0130 21:16:58.530724 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:16:58 crc kubenswrapper[4834]: E0130 21:16:58.530814 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.602831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.602893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.602910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.602937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.602958 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.705457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.705530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.705541 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.705565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.705576 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.791162 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 03:53:45.106856246 +0000 UTC Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.808313 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.808354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.808363 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.808380 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.808409 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.911671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.911709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.911722 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.911739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:58 crc kubenswrapper[4834]: I0130 21:16:58.911749 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:58Z","lastTransitionTime":"2026-01-30T21:16:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.014303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.014353 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.014363 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.014382 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.014410 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.106953 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/0.log" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.107028 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5655f" event={"ID":"25f6f1cd-cd4b-475a-85a3-4e81cda5d203","Type":"ContainerStarted","Data":"280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.116173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.116216 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.116226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.116241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.116253 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.130572 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated 
for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.149212 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.164708 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.187121 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.198877 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.211984 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.218519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.218560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.218568 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.218586 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.218597 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.222772 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc 
kubenswrapper[4834]: I0130 21:16:59.234960 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.254717 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.264282 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.275450 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.287701 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.304237 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.321047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.321095 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.321109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.321128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.321139 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.327018 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.344993 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.357602 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.369464 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.384762 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:57Z\\\",\\\"message\\\":\\\"2026-01-30T21:16:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e\\\\n2026-01-30T21:16:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e to /host/opt/cni/bin/\\\\n2026-01-30T21:16:12Z [verbose] multus-daemon started\\\\n2026-01-30T21:16:12Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:16:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.424311 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.424369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.424388 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 
21:16:59.424446 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.424473 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.527116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.527192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.527214 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.527244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.527262 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.530545 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:16:59 crc kubenswrapper[4834]: E0130 21:16:59.530719 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.554509 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.576102 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.598746 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.614136 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.628001 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.629690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.629747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.629765 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.629790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.629809 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.647833 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.661273 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.684717 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e591e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.700679 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.720216 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:57Z\\\",\\\"message\\\":\\\"2026-01-30T21:16:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e\\\\n2026-01-30T21:16:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e to /host/opt/cni/bin/\\\\n2026-01-30T21:16:12Z [verbose] multus-daemon started\\\\n2026-01-30T21:16:12Z [verbose] 
Readiness Indicator file check\\\\n2026-01-30T21:16:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.732679 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.732722 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.732734 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.732755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.732769 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.740455 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.757965 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.774030 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.792310 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 15:00:05.685229819 +0000 UTC Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.794638 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T2
1:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f19
5d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] 
\\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatus
es\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.809183 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.826326 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.835762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.835859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.835889 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.835926 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.835952 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.841767 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc 
kubenswrapper[4834]: I0130 21:16:59.871851 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:16:59Z is after 2025-08-24T17:21:41Z" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.939057 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.939117 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.939128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.939149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:16:59 crc kubenswrapper[4834]: I0130 21:16:59.939162 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:16:59Z","lastTransitionTime":"2026-01-30T21:16:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.042479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.042542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.042554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.042575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.042588 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.145145 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.145211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.145232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.145261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.145280 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.248247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.248302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.248316 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.248340 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.248355 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.350434 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.350494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.350505 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.350525 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.350537 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.453755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.453822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.453883 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.453908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.453919 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.530473 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:00 crc kubenswrapper[4834]: E0130 21:17:00.530615 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.530809 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:00 crc kubenswrapper[4834]: E0130 21:17:00.530872 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.530996 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:00 crc kubenswrapper[4834]: E0130 21:17:00.531041 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.556964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.557046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.557059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.557083 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.557097 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.659700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.659742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.659756 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.659778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.659794 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.762485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.762528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.762537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.762553 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.762564 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.794136 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 21:06:06.270147999 +0000 UTC Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.865331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.865377 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.865412 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.865435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.865451 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.969975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.970042 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.970065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.970093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:00 crc kubenswrapper[4834]: I0130 21:17:00.970111 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:00Z","lastTransitionTime":"2026-01-30T21:17:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.072842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.072949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.072972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.073003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.073023 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.176000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.176042 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.176052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.176068 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.176080 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.279544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.279583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.279593 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.279608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.279618 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.382829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.382866 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.382879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.382893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.382903 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.485222 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.485264 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.485280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.485295 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.485305 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.530284 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:01 crc kubenswrapper[4834]: E0130 21:17:01.530464 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.531889 4834 scope.go:117] "RemoveContainer" containerID="e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.588098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.588140 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.588149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.588166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.588177 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.690764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.690822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.690836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.690859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.690871 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.793648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.793725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.793742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.793768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.793784 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.794322 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 07:01:04.702940893 +0000 UTC Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.897174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.897230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.897243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.897262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:01 crc kubenswrapper[4834]: I0130 21:17:01.897273 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:01Z","lastTransitionTime":"2026-01-30T21:17:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.000348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.000384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.000407 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.000423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.000434 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.104339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.104385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.104419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.107648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.107704 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.120804 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/2.log" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.125987 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.126847 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.147306 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.165586 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.189433 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:57Z\\\",\\\"message\\\":\\\"2026-01-30T21:16:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e\\\\n2026-01-30T21:16:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e to /host/opt/cni/bin/\\\\n2026-01-30T21:16:12Z [verbose] multus-daemon started\\\\n2026-01-30T21:16:12Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:16:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.211339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.211382 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.211416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 
21:17:02.211437 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.211452 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.215882 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.242997 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.261136 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.284201 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e926356
48fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.298331 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.310356 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.313332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.313358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.313371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.313387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.313415 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.326344 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc 
kubenswrapper[4834]: I0130 21:17:02.338161 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.353815 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.369916 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.386848 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.400970 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.415998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.416053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.416067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc 
kubenswrapper[4834]: I0130 21:17:02.416086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.416098 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.417002 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27
9556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.442367 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 
21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.456510 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:02Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.519922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.519978 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.519997 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.520020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.520037 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.530888 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.530958 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.531014 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:02 crc kubenswrapper[4834]: E0130 21:17:02.531084 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:02 crc kubenswrapper[4834]: E0130 21:17:02.531201 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:02 crc kubenswrapper[4834]: E0130 21:17:02.531311 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.622884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.622921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.622930 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.622948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.622958 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.726468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.726544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.726568 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.726606 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.726630 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.794549 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 13:32:10.962040682 +0000 UTC Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.829601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.829678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.829702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.829733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.829755 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.932832 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.932948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.932975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.933004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:02 crc kubenswrapper[4834]: I0130 21:17:02.933025 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:02Z","lastTransitionTime":"2026-01-30T21:17:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.036357 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.036440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.036459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.036489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.036512 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.132901 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/3.log" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.133628 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/2.log" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.137696 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d" exitCode=1 Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.137735 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.137770 4834 scope.go:117] "RemoveContainer" containerID="e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.138174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.138236 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.138252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.138276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.138296 
4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.138278 4834 scope.go:117] "RemoveContainer" containerID="b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d" Jan 30 21:17:03 crc kubenswrapper[4834]: E0130 21:17:03.138560 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.208193 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.231005 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.241228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.241286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.241310 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.241339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.241363 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.248513 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.264337 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.286075 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.315544 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e91ef761d028dc44bc7ecb8a418b568fe1a5548a4179412ee06faabe7f130e2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:37Z\\\",\\\"message\\\":\\\"4821 6442 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.524990 6442 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0130 21:16:37.525039 6442 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525088 6442 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525133 6442 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0130 21:16:37.525795 6442 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0130 21:16:37.525889 6442 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0130 21:16:37.525899 6442 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0130 21:16:37.525933 6442 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0130 21:16:37.525954 6442 factory.go:656] Stopping watch factory\\\\nI0130 21:16:37.525962 6442 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0130 21:16:37.525973 6442 ovnkube.go:599] Stopped ovnkube\\\\nI0130 21\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:17:02Z\\\",\\\"message\\\":\\\"trics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 21:17:02.508102 6827 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI0130 21:17:02.508257 6827 obj_retry.go:409] Going to retry *v1.Pod resource setup for 15 objects: [openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-4xmxm 
openshift-dns/node-resolver-42cwb openshift-multus/network-metrics-daemon-j5pcw openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj openshift-kube-controller-manager/kube-controller-manager-crc openshift-machine-config-operator/machine-config-daemon-drghn openshift-multus/multus-5655f openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-image-registry/node-ca-sqm85]\\\\nF0130 21:17:02.508264 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:17:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.329754 4834 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.343739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.343784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.343803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.343826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.343843 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.346079 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e591e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.360841 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.374029 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.394517 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:57Z\\\",\\\"message\\\":\\\"2026-01-30T21:16:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e\\\\n2026-01-30T21:16:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e to /host/opt/cni/bin/\\\\n2026-01-30T21:16:12Z [verbose] multus-daemon started\\\\n2026-01-30T21:16:12Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:16:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.415805 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 
21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.431280 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.446209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.446231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.446240 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.446253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.446261 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.448513 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.465837 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a
553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.481116 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.495356 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.510049 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:03Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:03 crc 
kubenswrapper[4834]: I0130 21:17:03.530565 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:03 crc kubenswrapper[4834]: E0130 21:17:03.530748 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.548134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.548176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.548193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.548213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.548230 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.650713 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.650753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.650766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.650783 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.650793 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.752652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.752708 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.752727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.752751 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.752768 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.795237 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 10:05:56.284109547 +0000 UTC Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.856156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.856221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.856238 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.856263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.856281 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.962740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.962800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.962819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.962846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:03 crc kubenswrapper[4834]: I0130 21:17:03.962865 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:03Z","lastTransitionTime":"2026-01-30T21:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.066256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.066311 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.066333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.066365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.066386 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.144049 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/3.log" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.150370 4834 scope.go:117] "RemoveContainer" containerID="b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d" Jan 30 21:17:04 crc kubenswrapper[4834]: E0130 21:17:04.152448 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.165320 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 
21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.170162 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.170216 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.170241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.170270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.170296 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.178273 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.193726 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.212825 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.229366 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.246946 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.260803 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc 
kubenswrapper[4834]: I0130 21:17:04.274416 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.280067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.280155 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.280174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.280205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.280225 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.291856 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.306501 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.318831 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.336246 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.360512 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.384221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.384268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.384288 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.384313 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.384332 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.386589 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:17:02Z\\\",\\\"message\\\":\\\"trics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 21:17:02.508102 6827 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI0130 21:17:02.508257 6827 obj_retry.go:409] Going to retry *v1.Pod resource setup for 15 objects: 
[openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-4xmxm openshift-dns/node-resolver-42cwb openshift-multus/network-metrics-daemon-j5pcw openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj openshift-kube-controller-manager/kube-controller-manager-crc openshift-machine-config-operator/machine-config-daemon-drghn openshift-multus/multus-5655f openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-image-registry/node-ca-sqm85]\\\\nF0130 21:17:02.508264 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:17:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.405001 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.422772 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.438345 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.460267 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:57Z\\\",\\\"message\\\":\\\"2026-01-30T21:16:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e\\\\n2026-01-30T21:16:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e to /host/opt/cni/bin/\\\\n2026-01-30T21:16:12Z [verbose] multus-daemon started\\\\n2026-01-30T21:16:12Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:16:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:04Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.489782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.489849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.489871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 
21:17:04.489898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.489914 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.530976 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.531044 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.530971 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:04 crc kubenswrapper[4834]: E0130 21:17:04.531172 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:04 crc kubenswrapper[4834]: E0130 21:17:04.531319 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:04 crc kubenswrapper[4834]: E0130 21:17:04.531693 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.592787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.592862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.592904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.592935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.592957 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.696428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.696511 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.696533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.696567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.696589 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.796505 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 13:27:30.567098118 +0000 UTC Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.800321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.800344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.800355 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.800370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.800383 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.903703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.903763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.903782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.903812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:04 crc kubenswrapper[4834]: I0130 21:17:04.903827 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:04Z","lastTransitionTime":"2026-01-30T21:17:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.006804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.006886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.006901 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.006942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.006955 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.109085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.109129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.109141 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.109159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.109171 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.212828 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.212892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.212923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.212959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.212983 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.316771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.316843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.316861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.316888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.316907 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.420436 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.420512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.420532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.420598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.420620 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.528589 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.528683 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.528699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.528731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.528749 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.530278 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:05 crc kubenswrapper[4834]: E0130 21:17:05.530428 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.632423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.632460 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.632486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.632510 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.632530 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.736173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.736222 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.736238 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.736261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.736278 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.796859 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 10:45:42.140456976 +0000 UTC Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.839523 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.839589 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.839600 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.839626 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.839641 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.942557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.942620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.942645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.942676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:05 crc kubenswrapper[4834]: I0130 21:17:05.942699 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:05Z","lastTransitionTime":"2026-01-30T21:17:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.046471 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.046518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.046535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.046560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.046578 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.149199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.149265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.149282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.149308 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.149329 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.253180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.253247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.253265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.253290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.253311 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.356786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.356846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.356867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.356896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.356919 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.459753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.459828 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.459857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.459890 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.459912 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.530643 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.530738 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:06 crc kubenswrapper[4834]: E0130 21:17:06.530890 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.531034 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:06 crc kubenswrapper[4834]: E0130 21:17:06.531214 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:06 crc kubenswrapper[4834]: E0130 21:17:06.531770 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.563299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.563341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.563365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.563423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.563449 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.665992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.666043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.666066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.666094 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.666116 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.768980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.769023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.769043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.769070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.769092 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.797840 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 07:00:05.061524664 +0000 UTC Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.872971 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.873031 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.873048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.873072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.873093 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.976605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.976673 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.976692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.976720 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:06 crc kubenswrapper[4834]: I0130 21:17:06.976741 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:06Z","lastTransitionTime":"2026-01-30T21:17:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.079957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.080022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.080041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.080066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.080086 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.182276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.182343 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.182364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.182389 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.182434 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.198061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.198281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.198434 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.198597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.198729 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: E0130 21:17:07.219902 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.224931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.224997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.225024 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.225071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.225095 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: E0130 21:17:07.246194 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.250419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.250496 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.250542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.250564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.250581 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: E0130 21:17:07.269043 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.272903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.272973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.272999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.273030 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.273055 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: E0130 21:17:07.293857 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.298301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.298339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.298352 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.298369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.298384 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: E0130 21:17:07.312774 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:07Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:07 crc kubenswrapper[4834]: E0130 21:17:07.312939 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.314879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.314909 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.314920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.314937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.314954 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.417732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.417796 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.417814 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.417839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.417859 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.520675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.520742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.520759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.520784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.520802 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.530115 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:07 crc kubenswrapper[4834]: E0130 21:17:07.530300 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.623856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.623918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.623937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.623963 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.624004 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.727306 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.727376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.727416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.727443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.727462 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.798232 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:43:10.920403108 +0000 UTC Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.830350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.830440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.830458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.830484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.830505 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.934205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.934273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.934290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.934318 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:07 crc kubenswrapper[4834]: I0130 21:17:07.934337 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:07Z","lastTransitionTime":"2026-01-30T21:17:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.038443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.038519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.038538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.038564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.038586 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.142186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.142265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.142293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.142324 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.142345 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.246851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.246886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.246897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.246914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.246926 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.351235 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.351345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.351365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.351387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.351466 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.454960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.455027 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.455046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.455072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.455090 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.530125 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.530244 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:08 crc kubenswrapper[4834]: E0130 21:17:08.530345 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.530252 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:08 crc kubenswrapper[4834]: E0130 21:17:08.530474 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:08 crc kubenswrapper[4834]: E0130 21:17:08.530592 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.559183 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.559290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.559310 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.559371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.559426 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.662953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.663022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.663039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.663064 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.663085 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.765753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.765836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.765855 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.765882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.765901 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.799158 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 04:12:50.086359405 +0000 UTC Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.870043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.870120 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.870145 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.870180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.870203 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.973739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.973816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.973834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.973860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:08 crc kubenswrapper[4834]: I0130 21:17:08.973883 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:08Z","lastTransitionTime":"2026-01-30T21:17:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.077618 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.077686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.077706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.077732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.077752 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.179960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.180018 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.180035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.180059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.180128 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.282979 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.283059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.283084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.283114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.283136 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.385791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.385857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.385888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.385918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.385940 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.488596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.488678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.488706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.488731 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.488751 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.531143 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:09 crc kubenswrapper[4834]: E0130 21:17:09.531448 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.561911 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" 
feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
6bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.588162 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.598092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.598173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.598196 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.598223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.598248 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.611826 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.643115 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a
553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.664357 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.683100 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.702232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.702289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.702306 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.702332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.702350 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.705392 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc 
kubenswrapper[4834]: I0130 21:17:09.724980 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.761608 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.795629 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:17:02Z\\\",\\\"message\\\":\\\"trics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 21:17:02.508102 6827 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI0130 21:17:02.508257 6827 obj_retry.go:409] Going to retry *v1.Pod resource setup for 15 objects: 
[openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-4xmxm openshift-dns/node-resolver-42cwb openshift-multus/network-metrics-daemon-j5pcw openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj openshift-kube-controller-manager/kube-controller-manager-crc openshift-machine-config-operator/machine-config-daemon-drghn openshift-multus/multus-5655f openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-image-registry/node-ca-sqm85]\\\\nF0130 21:17:02.508264 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:17:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.799411 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 22:34:41.775571885 +0000 UTC Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.804181 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.804209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.804218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.804234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: 
I0130 21:17:09.804245 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.808857 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.822059 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.855295 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.871999 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.885521 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.902920 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.907002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.907046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.907058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.907075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.907088 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:09Z","lastTransitionTime":"2026-01-30T21:17:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.913627 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:09 crc kubenswrapper[4834]: I0130 21:17:09.932001 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280dbea89a1ce891a4af9a326
c75a34f13283acce5e635528e3207c0ee569349\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:57Z\\\",\\\"message\\\":\\\"2026-01-30T21:16:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e\\\\n2026-01-30T21:16:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e to /host/opt/cni/bin/\\\\n2026-01-30T21:16:12Z [verbose] multus-daemon started\\\\n2026-01-30T21:16:12Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:16:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:09Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.010645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.010707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.010727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.010755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.010776 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.113156 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.113229 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.113251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.113278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.113297 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.216135 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.216230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.216252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.216279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.216299 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.319453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.319507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.319522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.319546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.319563 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.422622 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.422678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.422696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.422721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.422740 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.526586 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.526651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.526663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.526686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.526704 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.530021 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.530107 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.530123 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:10 crc kubenswrapper[4834]: E0130 21:17:10.530223 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:10 crc kubenswrapper[4834]: E0130 21:17:10.530303 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:10 crc kubenswrapper[4834]: E0130 21:17:10.530533 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.629067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.629111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.629123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.629142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.629152 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.732326 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.732385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.732423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.732443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.732455 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.799650 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 15:35:38.745587314 +0000 UTC Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.835291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.835350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.835368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.835420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.835440 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.937716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.937781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.937799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.937846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:10 crc kubenswrapper[4834]: I0130 21:17:10.937864 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:10Z","lastTransitionTime":"2026-01-30T21:17:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.045940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.046085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.046159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.046193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.046220 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.149838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.150231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.150248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.150272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.150290 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.253758 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.253843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.253867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.253899 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.253957 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.356834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.356927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.356946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.356972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.357176 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.460812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.460885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.460903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.460931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.460949 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.530754 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:11 crc kubenswrapper[4834]: E0130 21:17:11.530932 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.564493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.564549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.564572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.564601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.564623 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.667637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.667706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.667722 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.667746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.667766 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.771000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.771059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.771081 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.771106 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.771124 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.800460 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 00:55:45.275818747 +0000 UTC Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.873975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.874032 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.874053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.874078 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.874096 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.977474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.977612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.977632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.977659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:11 crc kubenswrapper[4834]: I0130 21:17:11.977677 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:11Z","lastTransitionTime":"2026-01-30T21:17:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.080188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.080230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.080241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.080258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.080270 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.182773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.182808 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.182817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.182831 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.182841 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.285913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.285962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.285974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.285992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.286003 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.389184 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.389245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.389266 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.389291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.389310 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.492746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.492818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.492837 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.492863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.492880 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.530675 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.530721 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.530748 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:12 crc kubenswrapper[4834]: E0130 21:17:12.530849 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:12 crc kubenswrapper[4834]: E0130 21:17:12.530996 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:12 crc kubenswrapper[4834]: E0130 21:17:12.531133 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.596434 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.596498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.596518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.596542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.596560 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.700159 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.700220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.700239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.700264 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.700281 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.801269 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 22:59:08.83247198 +0000 UTC Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.807556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.807633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.807674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.807714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.807743 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.911341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.911458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.911479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.911506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:12 crc kubenswrapper[4834]: I0130 21:17:12.911525 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:12Z","lastTransitionTime":"2026-01-30T21:17:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.014944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.015012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.015037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.015064 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.015119 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.117912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.117998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.118026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.118056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.118080 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.221920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.222017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.222045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.222079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.222102 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.325378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.325519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.325538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.325565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.325592 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.429297 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.429370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.429423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.429452 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.429473 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.530867 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.531084 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.532640 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.532851 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.532903 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.532951 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.532998 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533049 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533147 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.533117168 +0000 UTC m=+148.686263306 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533172 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533199 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533217 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533284 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.533259802 +0000 UTC m=+148.686405980 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533368 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533385 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533431 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533474 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-01-30 21:18:17.533460368 +0000 UTC m=+148.686606546 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533565 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.533555991 +0000 UTC m=+148.686702129 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.532896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.533623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.533634 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.533658 4834 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.533671 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533718 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 30 21:17:13 crc kubenswrapper[4834]: E0130 21:17:13.533769 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.533751907 +0000 UTC m=+148.686898075 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.636459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.636527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.636545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.636576 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.636594 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.740202 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.740287 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.740314 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.740349 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.740372 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.801743 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 17:02:02.094610032 +0000 UTC
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.843503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.843575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.843593 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.844046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.844112 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.948085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.948134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.948150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.948173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:13 crc kubenswrapper[4834]: I0130 21:17:13.948191 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:13Z","lastTransitionTime":"2026-01-30T21:17:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.051760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.051828 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.051846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.051873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.051892 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.155364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.155488 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.155505 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.155527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.155544 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.258351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.258462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.258489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.258520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.258537 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.362283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.362361 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.362385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.362451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.362513 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.465627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.465692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.465709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.465735 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.465753 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.530294 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw"
Jan 30 21:17:14 crc kubenswrapper[4834]: E0130 21:17:14.530515 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.530619 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.530717 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 30 21:17:14 crc kubenswrapper[4834]: E0130 21:17:14.531095 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 30 21:17:14 crc kubenswrapper[4834]: E0130 21:17:14.531218 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.531579 4834 scope.go:117] "RemoveContainer" containerID="b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d"
Jan 30 21:17:14 crc kubenswrapper[4834]: E0130 21:17:14.531849 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.569903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.569988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.570005 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.570028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.570047 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.673836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.673920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.673940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.673970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.673993 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.777493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.777581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.777606 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.777634 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.777655 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.803539 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:44:00.278155928 +0000 UTC
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.882149 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.882221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.882243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.882273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.882295 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.987884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.987964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.987990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.988028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:14 crc kubenswrapper[4834]: I0130 21:17:14.988054 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:14Z","lastTransitionTime":"2026-01-30T21:17:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.091751 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.091814 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.091829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.091849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.091864 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.194948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.195506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.195519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.195537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.195549 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.299274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.299342 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.299362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.299389 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.299464 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.402715 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.402776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.402793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.402819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.402837 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.506635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.506699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.506713 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.506743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.506765 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.530054 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 30 21:17:15 crc kubenswrapper[4834]: E0130 21:17:15.530220 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.609620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.609664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.609674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.609689 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.609700 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.712853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.712924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.712944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.712970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.712989 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.804094 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:20:58.422118891 +0000 UTC
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.816846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.816913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.816934 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.816960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.816978 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.919325 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.919385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.919435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.919463 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:15 crc kubenswrapper[4834]: I0130 21:17:15.919485 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:15Z","lastTransitionTime":"2026-01-30T21:17:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.022375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.022459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.022480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.022503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.022521 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.125812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.125877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.125897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.125922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.125942 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.228458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.228517 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.228534 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.228562 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.228580 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.331988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.332065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.332083 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.332107 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.332125 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.435545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.435624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.435650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.435677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.435699 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.529923 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.530009 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:16 crc kubenswrapper[4834]: E0130 21:17:16.530117 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:16 crc kubenswrapper[4834]: E0130 21:17:16.530182 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.530020 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:16 crc kubenswrapper[4834]: E0130 21:17:16.530282 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.538075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.538132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.538150 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.538171 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.538191 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.641423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.641467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.641476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.641490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.641502 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.744155 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.744252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.744281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.744314 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.744341 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.805313 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 00:39:18.988618135 +0000 UTC Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.847743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.847807 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.847824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.847849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.847867 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.951245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.951307 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.951328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.951356 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:16 crc kubenswrapper[4834]: I0130 21:17:16.951437 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:16Z","lastTransitionTime":"2026-01-30T21:17:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.054369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.054464 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.054490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.054518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.054542 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.158014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.158072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.158091 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.158115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.158133 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.261152 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.261217 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.261234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.261258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.261276 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.364329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.364466 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.364484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.364506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.364523 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.467815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.467868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.467884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.467904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.467921 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.530762 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:17 crc kubenswrapper[4834]: E0130 21:17:17.531037 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.570051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.570105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.570121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.570142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.570158 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.574878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.574922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.574937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.574956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.574972 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: E0130 21:17:17.595909 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.601552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.601637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.601656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.601677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.601694 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: E0130 21:17:17.624474 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.629152 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.629197 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.629214 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.629292 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.629310 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: E0130 21:17:17.651211 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…image list elided: verbatim repeat of the payload in the preceding patch attempt…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.656801 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.656886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.656911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.656947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.656973 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: E0130 21:17:17.679585 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.684679 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.684736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.684759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.684788 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.684810 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: E0130 21:17:17.705809 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-30T21:17:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b49f675e-147a-40a2-ab31-7b9d1f2d710c\\\",\\\"systemUUID\\\":\\\"a8c42df5-e7c6-43f3-b21d-2acb5110253c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:17Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:17 crc kubenswrapper[4834]: E0130 21:17:17.706121 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.708187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.708244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.708262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.708285 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.708343 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.805485 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 17:43:54.765978944 +0000 UTC Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.810591 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.810670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.810689 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.810715 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.810733 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.913570 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.913628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.913700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.913776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:17 crc kubenswrapper[4834]: I0130 21:17:17.913812 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:17Z","lastTransitionTime":"2026-01-30T21:17:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.016146 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.016241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.016262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.016333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.016352 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.119704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.119776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.119794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.119822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.119849 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.223152 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.223223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.223241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.223263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.223282 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.326362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.326443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.326464 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.326492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.326509 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.429774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.429824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.429844 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.429867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.429885 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.530873 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.530892 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.530954 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:18 crc kubenswrapper[4834]: E0130 21:17:18.531050 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:18 crc kubenswrapper[4834]: E0130 21:17:18.531314 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:18 crc kubenswrapper[4834]: E0130 21:17:18.531429 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.533993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.534040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.534058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.534079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.534095 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.638208 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.638279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.638298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.638324 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.638345 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.741811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.741871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.741890 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.741914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.741931 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.806004 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 09:07:01.432430999 +0000 UTC Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.845110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.845186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.845207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.845233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.845251 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.949633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.949727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.949755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.949784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:18 crc kubenswrapper[4834]: I0130 21:17:18.949805 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:18Z","lastTransitionTime":"2026-01-30T21:17:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.052490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.052552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.052569 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.052595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.052614 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.155941 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.156008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.156026 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.156052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.156071 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.258859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.258922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.258940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.258964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.258982 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.362122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.362166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.362175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.362188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.362197 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.464703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.464773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.464791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.464818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.464838 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.530774 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:19 crc kubenswrapper[4834]: E0130 21:17:19.531134 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.543957 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.556379 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7f721ec-6832-449b-a0ce-13e548448b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27d600151e6309486506a2a25f3c201965bb43f8ad8c74046caa568cee2663a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b30f4316613abd12ab100bdd90ca55897533884cb54be5c90bf7a85c77a24d38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc83e188d0f2034d20e4198bfb68eb8d15da121444f655947695255b1c0539af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
eaceba3acf0a1213558d27efd7f0a4d7fe39831b428b2ac02552589f136871c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55244228a5a0eb4d8207c8a8ad11ddeedea9144cb6f34bc55879d970faccaac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04f4f9ff70dac6525c576f86758226904e81d685a9a337e9771764a553a2d695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ba087c5d676984cefcf3b1525f233fd9f3297c8a7da4c86621f5cc6ffa261bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d71fd38e801ce2f50393bf1442e92635648fa801a4232973cc33f5bcb1d55b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.568305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.568420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.568443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.568473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.568573 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.576228 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa14d9da-c19a-4958-8c9f-06a0c7967200\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05e56340b969bc36e9ba7d8bf2625366673638fe1d3c8c4cda3248de82fe59a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70243085c2efe3822427d6c570401d8a09a3509aaca4a079c86d5f2f9c9ad9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e243e0c754fa4a63383e010ff28f1f4772b79055f71944af761e8ecdc7c6685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db59914ca5f0cb8ee0f233b1a831820f59bf835af69a09d79240a5c52bb84535\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.595682 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4cdf27a1647c782b6a12646498f607fc548c4cbe5c3121924c27d17fe964b37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddfeaba99b48324c5c06a6561996ab7e418650e80035b042f8b7239bb2923b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.612043 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a589ab-0e20-4c47-a923-363b3be97b20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zq4j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j5pcw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc 
kubenswrapper[4834]: I0130 21:17:19.627388 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sqm85" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d3a851f-fc14-4b9c-b9c1-a92da4b27262\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f230994251fc20ba6b9e1e075f0a107016134c9b1cab4072b2b141aa5fdf0a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
2qm9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sqm85\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.644561 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf366c66f888b3316861a04af50d9ae389e8a8d32afe3292eed5217999fc63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.664078 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.671313 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.671468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.671491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc 
kubenswrapper[4834]: I0130 21:17:19.671553 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.671573 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.682153 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.697018 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"296cf2a5-374e-4730-9d40-8abb93c8e237\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7e7ea06b0554e6a18954868a2ee07fac8b257f89328ba966661038fa1289ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb411fc22b97adae64e88403f5c3ceb778843857
dd38d1c2d8767aada368c243\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drghn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.714540 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64f88d18-0675-4d43-82c3-23acaafb56c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c297d72c52ba5b8d2cd87d4a80b7700316e5809b0e90b38b5d9586bebebe2d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://490f853b134965129508bbfb11fa8a6cac3dd746ad100f6b40175903402b8d5b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50b828dbcdb26bbed5fd16ebf00ba77c643ec7b5bf5f802d82b6b069b772de96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3071015c4db1c868d9ffca408d4dc98b3ccbdbb33da90231f872ca44f31536b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2291
92312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e229192312e6cbde1655562aeba31b73173d2e017454b3a8a7c42fa0a4e05c57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bee436668927e130a4911adb4a4b0310ecc1501b1d62f97bdb5ec1ae5e9df19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:16Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://279556c0198beb34070fb6db19e7e760b7d0d5247ece210c26015b42eb456bdf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9h82\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2m7n\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.741133 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1133254b-8923-414d-8031-4dfe81f17e12\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:17:02Z\\\",\\\"message\\\":\\\"trics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0130 21:17:02.508102 6827 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nI0130 21:17:02.508257 6827 obj_retry.go:409] Going to retry *v1.Pod resource setup for 15 objects: 
[openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-4xmxm openshift-dns/node-resolver-42cwb openshift-multus/network-metrics-daemon-j5pcw openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-kube-scheduler/openshift-kube-scheduler-crc openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj openshift-kube-controller-manager/kube-controller-manager-crc openshift-machine-config-operator/machine-config-daemon-drghn openshift-multus/multus-5655f openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-image-registry/node-ca-sqm85]\\\\nF0130 21:17:02.508264 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:17:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f754fd972786527a
1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkdbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4xmxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.759278 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c550cc8-1d4f-40dd-9eac-8f11c34663dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f35a26ebbc65b920b8a7eca2368393ca252404ff32d725ff1a57b38afab686c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f8d2d8b25535c5c5eea4f613ca32d85a6e59
1e1eea2bcea06821f44ab4e48cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vnmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-76slj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.774478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.774536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.774558 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.774587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.774605 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.782007 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.799618 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-42cwb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96ed93d8-d6ab-42f5-8c10-cfc941d1931e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ed65f8c99c13b7864e9d23c2eaca3fbdc2e081b2f1cd65de5c2749ffd6c8625\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v7pfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:09Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-42cwb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.806797 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 21:15:02.670276517 +0000 UTC Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.820250 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5655f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25f6f1cd-cd4b-475a-85a3-4e81cda5d203\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-30T21:16:57Z\\\",\\\"message\\\":\\\"2026-01-30T21:16:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e\\\\n2026-01-30T21:16:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_36f5a035-8e6f-46b1-898e-f79b0df3d98e to /host/opt/cni/bin/\\\\n2026-01-30T21:16:12Z [verbose] multus-daemon started\\\\n2026-01-30T21:16:12Z [verbose] Readiness Indicator file check\\\\n2026-01-30T21:16:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gw9kq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:16:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5655f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.842008 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ccb7af7f-dc56-4806-be9c-cce94d47c10e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-30T21:16:08Z\\\",\\\"message\\\":\\\":]:17697\\\\nI0130 21:16:08.666524 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI0130 21:16:08.666546 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666569 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0130 21:16:08.666595 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3996480195/tls.crt::/tmp/serving-cert-3996480195/tls.key\\\\\\\"\\\\nI0130 21:16:08.666726 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI0130 21:16:08.667064 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667082 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0130 21:16:08.667107 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0130 21:16:08.667113 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0130 21:16:08.667190 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0130 21:16:08.667203 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI0130 21:16:08.669322 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670051 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI0130 21:16:08.670578 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF0130 21:16:08.673633 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-30T21:15:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.861838 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d78b1bb-3cf0-4d60-bf4c-db112d72abd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-30T21:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796f136fdf2270a61553f81330f7ac7b42837a4c07fac8e592d407a11beea516\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d6d7846c85f897b3f85f68663c718ffa5e7884bc96bfed5d7867b4c8bea89de\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f8d8865d042faed839878e6d323cfffb4355626e2fb48e1dc03c9ea0e649835\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:15:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-30T21:15:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.876997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.877186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.877308 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.877480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.877618 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.879587 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-30T21:16:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68d1c63a91be6820414b58cb0bb28ea05f2fc2ceee92e68c539216ae5131603e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-30T21:16:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-30T21:17:19Z is after 2025-08-24T17:21:41Z" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.980517 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.980581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.980599 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.980622 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:19 crc kubenswrapper[4834]: I0130 21:17:19.980642 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:19Z","lastTransitionTime":"2026-01-30T21:17:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.083967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.084034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.084052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.084081 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.084099 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.187262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.187347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.187369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.187431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.187459 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.291993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.292105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.292132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.292172 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.292201 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.396750 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.396836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.396856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.396881 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.396902 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.500803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.500893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.500914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.500946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.500972 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.531103 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.531193 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.531192 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:20 crc kubenswrapper[4834]: E0130 21:17:20.531366 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:20 crc kubenswrapper[4834]: E0130 21:17:20.531594 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:20 crc kubenswrapper[4834]: E0130 21:17:20.531703 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.604484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.604552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.604576 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.604603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.604625 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.708013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.708100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.708123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.708154 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.708179 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.807450 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 19:28:14.816308847 +0000 UTC Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.811211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.811285 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.811305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.811335 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.811355 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.915882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.915954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.915975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.916007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:20 crc kubenswrapper[4834]: I0130 21:17:20.916035 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:20Z","lastTransitionTime":"2026-01-30T21:17:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.019741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.019794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.019813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.019838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.019857 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.124845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.124888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.124905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.124929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.124948 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.227837 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.227887 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.227905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.227929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.227945 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.331136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.331205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.331223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.331249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.331274 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.434665 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.435074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.435095 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.435502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.435819 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.530333 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:21 crc kubenswrapper[4834]: E0130 21:17:21.530556 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.538243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.538299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.538321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.538347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.538370 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.641634 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.641686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.641702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.641726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.641775 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.743739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.743793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.743813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.743840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.743860 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.808586 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 11:09:39.848847504 +0000 UTC Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.847574 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.847628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.847645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.847670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.847688 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.950931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.950995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.951015 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.951044 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:21 crc kubenswrapper[4834]: I0130 21:17:21.951068 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:21Z","lastTransitionTime":"2026-01-30T21:17:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.054629 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.054684 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.054702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.054730 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.054749 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.158113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.158177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.158199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.158226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.158246 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.261603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.261661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.261679 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.261704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.261722 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.365069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.365125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.365142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.365170 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.365189 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.468369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.468494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.468524 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.468557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.468590 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.530733 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.530768 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:22 crc kubenswrapper[4834]: E0130 21:17:22.530966 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.530768 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:22 crc kubenswrapper[4834]: E0130 21:17:22.531157 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:22 crc kubenswrapper[4834]: E0130 21:17:22.531328 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.576385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.576520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.576540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.576583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.576607 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.679552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.679621 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.679639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.679665 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.679683 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.782074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.782134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.782153 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.782206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.782226 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.808766 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 16:20:50.665525054 +0000 UTC Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.886182 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.886262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.886281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.886305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.886323 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.990681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.990743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.990761 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.990785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:22 crc kubenswrapper[4834]: I0130 21:17:22.990802 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:22Z","lastTransitionTime":"2026-01-30T21:17:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.093781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.093846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.093864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.093893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.093912 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.197081 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.197142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.197163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.197188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.197210 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.301319 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.301362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.301379 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.301443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.301464 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.404389 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.404472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.404490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.404515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.404534 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.508572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.508654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.508676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.508714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.508743 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.531042 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:23 crc kubenswrapper[4834]: E0130 21:17:23.531219 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.612925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.613010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.613033 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.613069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.613097 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.716006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.716085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.716112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.716141 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.716160 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.809244 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:52:56.133886475 +0000 UTC Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.818675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.818781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.818802 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.818827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.818845 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.922018 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.922082 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.922100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.922125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:23 crc kubenswrapper[4834]: I0130 21:17:23.922146 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:23Z","lastTransitionTime":"2026-01-30T21:17:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.025348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.025491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.025552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.025585 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.025604 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.128383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.128501 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.128522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.128548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.128566 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.232959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.233035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.233048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.233066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.233076 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.336803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.336866 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.336887 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.336916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.336934 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.440573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.440650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.440675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.440884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.440905 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.530666 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.530689 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.530733 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:24 crc kubenswrapper[4834]: E0130 21:17:24.530923 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:24 crc kubenswrapper[4834]: E0130 21:17:24.531037 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:24 crc kubenswrapper[4834]: E0130 21:17:24.531264 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.544660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.544722 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.544739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.544764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.544782 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.647786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.647842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.647858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.647881 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.647897 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.750982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.751056 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.751080 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.751109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.751131 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.810078 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 22:07:43.270364162 +0000 UTC Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.853913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.853984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.854002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.854028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.854049 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.957143 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.957211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.957230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.957255 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:24 crc kubenswrapper[4834]: I0130 21:17:24.957273 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:24Z","lastTransitionTime":"2026-01-30T21:17:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.060110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.060164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.060182 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.060208 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.060227 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.162732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.162824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.162874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.162899 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.162918 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.265664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.265730 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.265749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.265776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.265796 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.369491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.369564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.369582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.369609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.369628 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.473709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.473779 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.473796 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.473823 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.473841 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.530996 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:25 crc kubenswrapper[4834]: E0130 21:17:25.531271 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.532495 4834 scope.go:117] "RemoveContainer" containerID="b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d" Jan 30 21:17:25 crc kubenswrapper[4834]: E0130 21:17:25.532768 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.576711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.576764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.576783 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.576805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.576824 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.680729 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.680799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.680817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.680845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.680868 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.784575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.784641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.784662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.784693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.784713 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.810285 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 16:45:40.789901348 +0000 UTC Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.888753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.888829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.888858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.888891 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.888915 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.992114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.992160 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.992176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.992199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:25 crc kubenswrapper[4834]: I0130 21:17:25.992216 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:25Z","lastTransitionTime":"2026-01-30T21:17:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.095565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.095638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.095661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.095691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.095715 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.199078 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.199147 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.199164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.199189 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.199208 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.302597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.302664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.302681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.302713 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.302730 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.405657 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.405777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.405806 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.405830 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.405846 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.508636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.508700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.508717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.508742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.508760 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.530912 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.530950 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.531062 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:26 crc kubenswrapper[4834]: E0130 21:17:26.531064 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:26 crc kubenswrapper[4834]: E0130 21:17:26.531167 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:26 crc kubenswrapper[4834]: E0130 21:17:26.531560 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.611992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.612070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.612096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.612122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.612141 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.716322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.716430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.716458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.716487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.716516 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.810523 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 23:33:50.656996209 +0000 UTC Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.819878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.819943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.819959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.819983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.820000 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.922660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.922692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.922700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.922712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:26 crc kubenswrapper[4834]: I0130 21:17:26.922722 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:26Z","lastTransitionTime":"2026-01-30T21:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.026166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.026228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.026243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.026266 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.026282 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.128942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.128997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.129016 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.129040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.129057 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.232660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.232747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.232770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.232800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.232823 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.336274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.336332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.336351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.336380 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.336426 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.439818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.439951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.439986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.440019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.440041 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.530794 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:27 crc kubenswrapper[4834]: E0130 21:17:27.531018 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.542873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.542920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.542937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.542959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.542978 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.647524 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.647595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.647614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.647642 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.647660 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.750673 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.750746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.750763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.750787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.750805 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.811467 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 13:08:52.771348397 +0000 UTC Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.853317 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.853401 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.853450 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.853478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.853496 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.956368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.956481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.956505 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.956533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:27 crc kubenswrapper[4834]: I0130 21:17:27.956554 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:27Z","lastTransitionTime":"2026-01-30T21:17:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.059132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.059199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.059218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.059245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.059263 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:28Z","lastTransitionTime":"2026-01-30T21:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.107865 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.107927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.107947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.107974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.107991 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-30T21:17:28Z","lastTransitionTime":"2026-01-30T21:17:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.170800 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5"] Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.171376 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.174052 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.174167 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.174349 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.178962 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.187059 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sqm85" podStartSLOduration=79.18702814 podStartE2EDuration="1m19.18702814s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.185521236 +0000 UTC m=+99.338667414" watchObservedRunningTime="2026-01-30 21:17:28.18702814 +0000 UTC m=+99.340174318" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.219263 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=9.219226047 podStartE2EDuration="9.219226047s" podCreationTimestamp="2026-01-30 21:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.204233396 +0000 UTC m=+99.357379544" watchObservedRunningTime="2026-01-30 21:17:28.219226047 +0000 UTC 
m=+99.372372225" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.277451 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podStartSLOduration=79.277402368 podStartE2EDuration="1m19.277402368s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.277271984 +0000 UTC m=+99.430418162" watchObservedRunningTime="2026-01-30 21:17:28.277402368 +0000 UTC m=+99.430548516" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.298785 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j2m7n" podStartSLOduration=79.298759326 podStartE2EDuration="1m19.298759326s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.298653673 +0000 UTC m=+99.451799841" watchObservedRunningTime="2026-01-30 21:17:28.298759326 +0000 UTC m=+99.451905474" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.314513 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05488ca9-bc58-4daa-af1c-d041d8f76db1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.314590 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/05488ca9-bc58-4daa-af1c-d041d8f76db1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: 
\"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.314713 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05488ca9-bc58-4daa-af1c-d041d8f76db1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.314780 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/05488ca9-bc58-4daa-af1c-d041d8f76db1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.314855 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/05488ca9-bc58-4daa-af1c-d041d8f76db1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.346978 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-76slj" podStartSLOduration=78.346724506 podStartE2EDuration="1m18.346724506s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.345941553 +0000 UTC m=+99.499087731" 
watchObservedRunningTime="2026-01-30 21:17:28.346724506 +0000 UTC m=+99.499870644" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.418584 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05488ca9-bc58-4daa-af1c-d041d8f76db1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.418631 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/05488ca9-bc58-4daa-af1c-d041d8f76db1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.418650 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/05488ca9-bc58-4daa-af1c-d041d8f76db1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.418699 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05488ca9-bc58-4daa-af1c-d041d8f76db1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.418717 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/05488ca9-bc58-4daa-af1c-d041d8f76db1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.418790 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/05488ca9-bc58-4daa-af1c-d041d8f76db1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.418844 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/05488ca9-bc58-4daa-af1c-d041d8f76db1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.419783 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/05488ca9-bc58-4daa-af1c-d041d8f76db1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.423794 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-42cwb" podStartSLOduration=79.423764712 podStartE2EDuration="1m19.423764712s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.423563376 +0000 UTC 
m=+99.576709524" watchObservedRunningTime="2026-01-30 21:17:28.423764712 +0000 UTC m=+99.576910860" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.425966 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05488ca9-bc58-4daa-af1c-d041d8f76db1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.447262 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5655f" podStartSLOduration=79.447238782 podStartE2EDuration="1m19.447238782s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.446075018 +0000 UTC m=+99.599221166" watchObservedRunningTime="2026-01-30 21:17:28.447238782 +0000 UTC m=+99.600384940" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.448383 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05488ca9-bc58-4daa-af1c-d041d8f76db1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5cbz5\" (UID: \"05488ca9-bc58-4daa-af1c-d041d8f76db1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.493743 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.501587 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.495179012 podStartE2EDuration="1m19.495179012s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.471339261 +0000 UTC m=+99.624485409" watchObservedRunningTime="2026-01-30 21:17:28.495179012 +0000 UTC m=+99.648325160" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.525303 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.525275707 podStartE2EDuration="1m16.525275707s" podCreationTimestamp="2026-01-30 21:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.504775794 +0000 UTC m=+99.657921942" watchObservedRunningTime="2026-01-30 21:17:28.525275707 +0000 UTC m=+99.678421885" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.534365 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:28 crc kubenswrapper[4834]: E0130 21:17:28.534561 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.534654 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.534746 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:28 crc kubenswrapper[4834]: E0130 21:17:28.534980 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:28 crc kubenswrapper[4834]: E0130 21:17:28.535378 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.585888 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.585865009 podStartE2EDuration="46.585865009s" podCreationTimestamp="2026-01-30 21:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.585673543 +0000 UTC m=+99.738819691" watchObservedRunningTime="2026-01-30 21:17:28.585865009 +0000 UTC m=+99.739011157" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.586606 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.5865932 podStartE2EDuration="1m20.5865932s" podCreationTimestamp="2026-01-30 21:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:28.569010263 +0000 UTC m=+99.722156411" watchObservedRunningTime="2026-01-30 21:17:28.5865932 +0000 UTC m=+99.739739358" Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.621487 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:28 crc kubenswrapper[4834]: E0130 21:17:28.621688 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:17:28 crc kubenswrapper[4834]: E0130 21:17:28.621778 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs podName:f8a589ab-0e20-4c47-a923-363b3be97b20 nodeName:}" failed. No retries permitted until 2026-01-30 21:18:32.621756694 +0000 UTC m=+163.774902932 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs") pod "network-metrics-daemon-j5pcw" (UID: "f8a589ab-0e20-4c47-a923-363b3be97b20") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.812294 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 09:42:34.593359821 +0000 UTC Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.812739 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 30 21:17:28 crc kubenswrapper[4834]: I0130 21:17:28.824450 4834 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 30 21:17:29 crc kubenswrapper[4834]: I0130 21:17:29.258952 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" event={"ID":"05488ca9-bc58-4daa-af1c-d041d8f76db1","Type":"ContainerStarted","Data":"b521afe56831d549690c9ef43484d491a9dde0362d5310a8395264ac4fe6845d"} Jan 30 21:17:29 crc kubenswrapper[4834]: I0130 21:17:29.259057 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" event={"ID":"05488ca9-bc58-4daa-af1c-d041d8f76db1","Type":"ContainerStarted","Data":"3f52c0ec036d4f4f455f2bd6d6f182562e7e5205d0e3dc3e3cd9f9ec41ad4cdb"} Jan 30 21:17:29 crc kubenswrapper[4834]: I0130 21:17:29.283020 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5cbz5" podStartSLOduration=80.282999938 podStartE2EDuration="1m20.282999938s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:29.282173584 +0000 UTC m=+100.435319812" watchObservedRunningTime="2026-01-30 21:17:29.282999938 +0000 UTC m=+100.436146086" Jan 30 21:17:29 crc kubenswrapper[4834]: I0130 21:17:29.532987 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:29 crc kubenswrapper[4834]: E0130 21:17:29.533222 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:30 crc kubenswrapper[4834]: I0130 21:17:30.530538 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:30 crc kubenswrapper[4834]: I0130 21:17:30.530538 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:30 crc kubenswrapper[4834]: E0130 21:17:30.530759 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:30 crc kubenswrapper[4834]: E0130 21:17:30.530996 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:30 crc kubenswrapper[4834]: I0130 21:17:30.530568 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:30 crc kubenswrapper[4834]: E0130 21:17:30.531161 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:31 crc kubenswrapper[4834]: I0130 21:17:31.530973 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:31 crc kubenswrapper[4834]: E0130 21:17:31.531657 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:32 crc kubenswrapper[4834]: I0130 21:17:32.530360 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:32 crc kubenswrapper[4834]: I0130 21:17:32.530454 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:32 crc kubenswrapper[4834]: I0130 21:17:32.530573 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:32 crc kubenswrapper[4834]: E0130 21:17:32.530748 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:32 crc kubenswrapper[4834]: E0130 21:17:32.530924 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:32 crc kubenswrapper[4834]: E0130 21:17:32.531196 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:33 crc kubenswrapper[4834]: I0130 21:17:33.530160 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:33 crc kubenswrapper[4834]: E0130 21:17:33.530717 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:34 crc kubenswrapper[4834]: I0130 21:17:34.530621 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:34 crc kubenswrapper[4834]: E0130 21:17:34.531287 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:34 crc kubenswrapper[4834]: I0130 21:17:34.530756 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:34 crc kubenswrapper[4834]: E0130 21:17:34.531614 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:34 crc kubenswrapper[4834]: I0130 21:17:34.530642 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:34 crc kubenswrapper[4834]: E0130 21:17:34.531828 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:35 crc kubenswrapper[4834]: I0130 21:17:35.530680 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:35 crc kubenswrapper[4834]: E0130 21:17:35.531756 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:36 crc kubenswrapper[4834]: I0130 21:17:36.529990 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:36 crc kubenswrapper[4834]: I0130 21:17:36.530055 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:36 crc kubenswrapper[4834]: I0130 21:17:36.530170 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:36 crc kubenswrapper[4834]: E0130 21:17:36.530291 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:36 crc kubenswrapper[4834]: E0130 21:17:36.530504 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:36 crc kubenswrapper[4834]: E0130 21:17:36.530624 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:37 crc kubenswrapper[4834]: I0130 21:17:37.530982 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:37 crc kubenswrapper[4834]: E0130 21:17:37.531220 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:38 crc kubenswrapper[4834]: I0130 21:17:38.530615 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:38 crc kubenswrapper[4834]: I0130 21:17:38.530743 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:38 crc kubenswrapper[4834]: E0130 21:17:38.530843 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:38 crc kubenswrapper[4834]: I0130 21:17:38.530860 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:38 crc kubenswrapper[4834]: E0130 21:17:38.530981 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:38 crc kubenswrapper[4834]: E0130 21:17:38.531139 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:39 crc kubenswrapper[4834]: I0130 21:17:39.530576 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:39 crc kubenswrapper[4834]: E0130 21:17:39.532820 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:39 crc kubenswrapper[4834]: I0130 21:17:39.533235 4834 scope.go:117] "RemoveContainer" containerID="b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d" Jan 30 21:17:39 crc kubenswrapper[4834]: E0130 21:17:39.533528 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4xmxm_openshift-ovn-kubernetes(1133254b-8923-414d-8031-4dfe81f17e12)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" Jan 30 21:17:40 crc kubenswrapper[4834]: I0130 21:17:40.530905 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:40 crc kubenswrapper[4834]: I0130 21:17:40.530949 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:40 crc kubenswrapper[4834]: I0130 21:17:40.530963 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:40 crc kubenswrapper[4834]: E0130 21:17:40.531101 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:40 crc kubenswrapper[4834]: E0130 21:17:40.531205 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:40 crc kubenswrapper[4834]: E0130 21:17:40.531308 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:41 crc kubenswrapper[4834]: I0130 21:17:41.530858 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:41 crc kubenswrapper[4834]: E0130 21:17:41.531038 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:42 crc kubenswrapper[4834]: I0130 21:17:42.530072 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:42 crc kubenswrapper[4834]: I0130 21:17:42.530122 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:42 crc kubenswrapper[4834]: E0130 21:17:42.530235 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:42 crc kubenswrapper[4834]: I0130 21:17:42.530265 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:42 crc kubenswrapper[4834]: E0130 21:17:42.530467 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:42 crc kubenswrapper[4834]: E0130 21:17:42.530806 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:43 crc kubenswrapper[4834]: I0130 21:17:43.530054 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:43 crc kubenswrapper[4834]: E0130 21:17:43.530194 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:44 crc kubenswrapper[4834]: I0130 21:17:44.316539 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/1.log" Jan 30 21:17:44 crc kubenswrapper[4834]: I0130 21:17:44.317282 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/0.log" Jan 30 21:17:44 crc kubenswrapper[4834]: I0130 21:17:44.317350 4834 generic.go:334] "Generic (PLEG): container finished" podID="25f6f1cd-cd4b-475a-85a3-4e81cda5d203" containerID="280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349" exitCode=1 Jan 30 21:17:44 crc kubenswrapper[4834]: I0130 21:17:44.317431 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5655f" event={"ID":"25f6f1cd-cd4b-475a-85a3-4e81cda5d203","Type":"ContainerDied","Data":"280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349"} Jan 30 21:17:44 crc kubenswrapper[4834]: I0130 21:17:44.317494 4834 scope.go:117] "RemoveContainer" containerID="0080a663783c0fc85566aed2169d65aec97138f3e8728fc51c8eacca4f5c8c25" Jan 30 21:17:44 crc kubenswrapper[4834]: 
I0130 21:17:44.318298 4834 scope.go:117] "RemoveContainer" containerID="280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349" Jan 30 21:17:44 crc kubenswrapper[4834]: E0130 21:17:44.318700 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5655f_openshift-multus(25f6f1cd-cd4b-475a-85a3-4e81cda5d203)\"" pod="openshift-multus/multus-5655f" podUID="25f6f1cd-cd4b-475a-85a3-4e81cda5d203" Jan 30 21:17:44 crc kubenswrapper[4834]: I0130 21:17:44.530295 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:44 crc kubenswrapper[4834]: E0130 21:17:44.530486 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:44 crc kubenswrapper[4834]: I0130 21:17:44.530747 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:44 crc kubenswrapper[4834]: E0130 21:17:44.530831 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:44 crc kubenswrapper[4834]: I0130 21:17:44.531034 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:44 crc kubenswrapper[4834]: E0130 21:17:44.531119 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:45 crc kubenswrapper[4834]: I0130 21:17:45.322764 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/1.log" Jan 30 21:17:45 crc kubenswrapper[4834]: I0130 21:17:45.530838 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:45 crc kubenswrapper[4834]: E0130 21:17:45.531070 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:46 crc kubenswrapper[4834]: I0130 21:17:46.530632 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:46 crc kubenswrapper[4834]: I0130 21:17:46.530701 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:46 crc kubenswrapper[4834]: E0130 21:17:46.530946 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:46 crc kubenswrapper[4834]: E0130 21:17:46.531087 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:46 crc kubenswrapper[4834]: I0130 21:17:46.531596 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:46 crc kubenswrapper[4834]: E0130 21:17:46.531781 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:47 crc kubenswrapper[4834]: I0130 21:17:47.530838 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:47 crc kubenswrapper[4834]: E0130 21:17:47.531183 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:48 crc kubenswrapper[4834]: I0130 21:17:48.530581 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:48 crc kubenswrapper[4834]: I0130 21:17:48.530594 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:48 crc kubenswrapper[4834]: I0130 21:17:48.530594 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:48 crc kubenswrapper[4834]: E0130 21:17:48.530789 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:48 crc kubenswrapper[4834]: E0130 21:17:48.531076 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:48 crc kubenswrapper[4834]: E0130 21:17:48.531162 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:49 crc kubenswrapper[4834]: I0130 21:17:49.530434 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:49 crc kubenswrapper[4834]: E0130 21:17:49.532301 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:49 crc kubenswrapper[4834]: E0130 21:17:49.562198 4834 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 30 21:17:49 crc kubenswrapper[4834]: E0130 21:17:49.664124 4834 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:17:50 crc kubenswrapper[4834]: I0130 21:17:50.530627 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:50 crc kubenswrapper[4834]: I0130 21:17:50.530751 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:50 crc kubenswrapper[4834]: I0130 21:17:50.531240 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:50 crc kubenswrapper[4834]: E0130 21:17:50.531371 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:50 crc kubenswrapper[4834]: E0130 21:17:50.531531 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:50 crc kubenswrapper[4834]: E0130 21:17:50.531868 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:51 crc kubenswrapper[4834]: I0130 21:17:51.531040 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:51 crc kubenswrapper[4834]: E0130 21:17:51.532341 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:52 crc kubenswrapper[4834]: I0130 21:17:52.529959 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:52 crc kubenswrapper[4834]: I0130 21:17:52.529983 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:52 crc kubenswrapper[4834]: I0130 21:17:52.530144 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:52 crc kubenswrapper[4834]: E0130 21:17:52.530701 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:52 crc kubenswrapper[4834]: E0130 21:17:52.530769 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:52 crc kubenswrapper[4834]: E0130 21:17:52.530948 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:53 crc kubenswrapper[4834]: I0130 21:17:53.530355 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:53 crc kubenswrapper[4834]: E0130 21:17:53.530611 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:53 crc kubenswrapper[4834]: I0130 21:17:53.531580 4834 scope.go:117] "RemoveContainer" containerID="b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d" Jan 30 21:17:54 crc kubenswrapper[4834]: I0130 21:17:54.358191 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/3.log" Jan 30 21:17:54 crc kubenswrapper[4834]: I0130 21:17:54.361307 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerStarted","Data":"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094"} Jan 30 21:17:54 crc kubenswrapper[4834]: I0130 21:17:54.361695 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:17:54 crc kubenswrapper[4834]: I0130 21:17:54.409649 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podStartSLOduration=104.409588708 podStartE2EDuration="1m44.409588708s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:17:54.409218947 +0000 UTC m=+125.562365085" watchObservedRunningTime="2026-01-30 21:17:54.409588708 +0000 UTC m=+125.562734896" Jan 30 21:17:54 crc kubenswrapper[4834]: I0130 21:17:54.529917 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:54 crc kubenswrapper[4834]: I0130 21:17:54.529974 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:54 crc kubenswrapper[4834]: I0130 21:17:54.530053 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:54 crc kubenswrapper[4834]: E0130 21:17:54.530166 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:54 crc kubenswrapper[4834]: E0130 21:17:54.530355 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:54 crc kubenswrapper[4834]: E0130 21:17:54.530549 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:54 crc kubenswrapper[4834]: I0130 21:17:54.535111 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j5pcw"] Jan 30 21:17:54 crc kubenswrapper[4834]: E0130 21:17:54.666356 4834 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:17:55 crc kubenswrapper[4834]: I0130 21:17:55.365287 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:55 crc kubenswrapper[4834]: E0130 21:17:55.365494 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:55 crc kubenswrapper[4834]: I0130 21:17:55.530300 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:55 crc kubenswrapper[4834]: E0130 21:17:55.530832 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:55 crc kubenswrapper[4834]: I0130 21:17:55.531019 4834 scope.go:117] "RemoveContainer" containerID="280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349" Jan 30 21:17:56 crc kubenswrapper[4834]: I0130 21:17:56.371825 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/1.log" Jan 30 21:17:56 crc kubenswrapper[4834]: I0130 21:17:56.371931 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5655f" event={"ID":"25f6f1cd-cd4b-475a-85a3-4e81cda5d203","Type":"ContainerStarted","Data":"1d1bb595b13953c5708f441831dee60e73b95ed8a4ad7deae34f1eb003a5eb62"} Jan 30 21:17:56 crc kubenswrapper[4834]: I0130 21:17:56.530623 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:56 crc kubenswrapper[4834]: I0130 21:17:56.530641 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:56 crc kubenswrapper[4834]: E0130 21:17:56.530832 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:56 crc kubenswrapper[4834]: E0130 21:17:56.530949 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:57 crc kubenswrapper[4834]: I0130 21:17:57.531174 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:57 crc kubenswrapper[4834]: I0130 21:17:57.531238 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:57 crc kubenswrapper[4834]: E0130 21:17:57.531438 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:17:57 crc kubenswrapper[4834]: E0130 21:17:57.531541 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:58 crc kubenswrapper[4834]: I0130 21:17:58.530648 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:17:58 crc kubenswrapper[4834]: I0130 21:17:58.530675 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:17:58 crc kubenswrapper[4834]: E0130 21:17:58.532556 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 30 21:17:58 crc kubenswrapper[4834]: E0130 21:17:58.532702 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 30 21:17:59 crc kubenswrapper[4834]: I0130 21:17:59.530795 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:17:59 crc kubenswrapper[4834]: I0130 21:17:59.530891 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:17:59 crc kubenswrapper[4834]: E0130 21:17:59.532741 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j5pcw" podUID="f8a589ab-0e20-4c47-a923-363b3be97b20" Jan 30 21:17:59 crc kubenswrapper[4834]: E0130 21:17:59.532865 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 30 21:18:00 crc kubenswrapper[4834]: I0130 21:18:00.530660 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:18:00 crc kubenswrapper[4834]: I0130 21:18:00.530700 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:18:00 crc kubenswrapper[4834]: I0130 21:18:00.535178 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 30 21:18:00 crc kubenswrapper[4834]: I0130 21:18:00.535251 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 30 21:18:00 crc kubenswrapper[4834]: I0130 21:18:00.535297 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 30 21:18:00 crc kubenswrapper[4834]: I0130 21:18:00.535334 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 30 21:18:01 crc kubenswrapper[4834]: I0130 21:18:01.530222 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:18:01 crc kubenswrapper[4834]: I0130 21:18:01.530231 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:18:01 crc kubenswrapper[4834]: I0130 21:18:01.534643 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 30 21:18:01 crc kubenswrapper[4834]: I0130 21:18:01.535092 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 30 21:18:06 crc kubenswrapper[4834]: I0130 21:18:06.510328 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.722215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.772320 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.773138 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.776347 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.777248 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.778015 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xvvtx"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.779196 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: W0130 21:18:08.792293 4834 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 30 21:18:08 crc kubenswrapper[4834]: E0130 21:18:08.792457 4834 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.793658 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wp8vc"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.798279 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.799664 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.800471 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: W0130 21:18:08.800814 4834 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.803301 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.812053 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 21:18:08 crc kubenswrapper[4834]: W0130 21:18:08.831854 4834 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-session": failed to list *v1.Secret: secrets "v4-0-config-system-session" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Jan 30 21:18:08 crc kubenswrapper[4834]: E0130 21:18:08.831952 4834 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-session\" is forbidden: User \"system:node:crc\" cannot list 
resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 21:18:08 crc kubenswrapper[4834]: W0130 21:18:08.832036 4834 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-service-ca": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-service-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Jan 30 21:18:08 crc kubenswrapper[4834]: E0130 21:18:08.832061 4834 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.832082 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.832218 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.832427 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: W0130 21:18:08.832942 4834 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace 
"openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 30 21:18:08 crc kubenswrapper[4834]: E0130 21:18:08.832976 4834 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.833041 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 21:18:08 crc kubenswrapper[4834]: W0130 21:18:08.833179 4834 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 30 21:18:08 crc kubenswrapper[4834]: E0130 21:18:08.833209 4834 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.833217 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5qhdb"] Jan 30 21:18:08 crc kubenswrapper[4834]: E0130 21:18:08.833493 4834 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": 
Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.834007 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:08 crc kubenswrapper[4834]: W0130 21:18:08.833234 4834 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Jan 30 21:18:08 crc kubenswrapper[4834]: E0130 21:18:08.834449 4834 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.834015 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.834752 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.834042 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.835106 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.835718 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.838745 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.839145 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.839368 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.842353 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.842600 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.853957 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.854912 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.854945 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zrxr5"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 
21:18:08.855509 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.856031 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.863447 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.864194 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.865710 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.865888 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.865974 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.866065 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.866147 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.866306 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.867640 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.869914 4834 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.870906 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.869932 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.871872 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.871911 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872012 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872077 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872130 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872079 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872176 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872138 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872313 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872434 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872475 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872556 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.872675 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.875660 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.878199 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.878242 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.878526 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.878756 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.878972 4834 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.879049 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.879165 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.879423 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.879613 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.879736 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.879773 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.879845 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.882203 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.882557 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.882752 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5qhdb"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.882881 4834 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.889770 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.896812 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.896877 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdklj\" (UniqueName: \"kubernetes.io/projected/fff76748-e7a2-446a-bb45-cf27a4a4e79f-kube-api-access-zdklj\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.896913 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d26c2\" (UniqueName: \"kubernetes.io/projected/b085cfec-e382-48c5-a623-679412c5b97e-kube-api-access-d26c2\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.896949 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897000 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897032 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897078 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fff76748-e7a2-446a-bb45-cf27a4a4e79f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897113 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897137 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897168 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897259 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897286 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-dir\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897316 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897348 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897420 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897451 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897476 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tllls\" (UniqueName: \"kubernetes.io/projected/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-kube-api-access-tllls\") pod 
\"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897531 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897572 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-policies\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897641 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.897680 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff76748-e7a2-446a-bb45-cf27a4a4e79f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:08 crc 
kubenswrapper[4834]: I0130 21:18:08.901179 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.902134 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.902810 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpqw6"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.905302 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.906981 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-25hvk"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.907298 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.909614 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4rjd7"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.910291 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.910577 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.911136 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.911292 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-25hvk" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.911452 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.911666 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.912186 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.912530 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.912792 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.917092 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.917751 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.917985 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.918179 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.918437 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.918615 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.918778 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 
21:18:08.922284 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.925004 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.925779 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.926372 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.926952 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.927011 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.927165 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.927297 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4k6d4"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.927854 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.928706 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wp8vc"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.931461 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xvvtx"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.933507 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.935149 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.935297 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.935490 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.935757 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.935881 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.935953 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.938819 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.936902 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.942333 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.942650 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.942957 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.943106 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.944133 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.944372 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.944926 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.944949 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4ngsh"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.945412 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.945929 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.946467 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.946757 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.946983 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.951781 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.952293 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.952701 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.953119 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.953697 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.953849 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.957033 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.957166 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.957369 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.957460 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.957495 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6n9gz"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.958027 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.958207 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.958428 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.958489 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.958620 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.959179 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pfvpm"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.959381 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.959953 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.960548 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vpnmb"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.985694 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.985914 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.986043 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.986409 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.988946 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.993494 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.994676 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-88rvm"] Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.995950 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.996867 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:08 crc kubenswrapper[4834]: I0130 21:18:08.997297 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.000457 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.002451 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.004859 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006051 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006093 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7ft\" (UniqueName: \"kubernetes.io/projected/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-kube-api-access-zm7ft\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006149 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: 
\"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006170 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4ksx\" (UniqueName: \"kubernetes.io/projected/e4abe8be-aa7d-46ad-a658-259955e42044-kube-api-access-c4ksx\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006192 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006212 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-serving-cert\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006240 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4abe8be-aa7d-46ad-a658-259955e42044-serving-cert\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006262 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/00580dd0-6712-4cd5-b651-a200271e0727-config\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006301 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/baf610cb-71c9-4589-8aa1-74bf3030485b-node-pullsecrets\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006334 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006352 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-audit-policies\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006371 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-encryption-config\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 
21:18:09.006406 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-dir\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006427 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krszt\" (UniqueName: \"kubernetes.io/projected/00580dd0-6712-4cd5-b651-a200271e0727-kube-api-access-krszt\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006448 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006466 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006485 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-client-ca\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: 
\"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006503 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/00580dd0-6712-4cd5-b651-a200271e0727-images\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006523 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/00580dd0-6712-4cd5-b651-a200271e0727-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006539 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-machine-approver-tls\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006561 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-serving-cert\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006578 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-auth-proxy-config\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006597 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grrm7\" (UniqueName: \"kubernetes.io/projected/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-kube-api-access-grrm7\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006617 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006640 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006660 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tllls\" (UniqueName: \"kubernetes.io/projected/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-kube-api-access-tllls\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: 
\"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006680 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-image-import-ca\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006715 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006736 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-config\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006757 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/baf610cb-71c9-4589-8aa1-74bf3030485b-audit-dir\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006774 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-etcd-client\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006802 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006822 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-policies\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006855 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006883 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff76748-e7a2-446a-bb45-cf27a4a4e79f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006903 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-etcd-serving-ca\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006922 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-config\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006937 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-etcd-client\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006954 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-encryption-config\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006977 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: 
\"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.006999 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdklj\" (UniqueName: \"kubernetes.io/projected/fff76748-e7a2-446a-bb45-cf27a4a4e79f-kube-api-access-zdklj\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007020 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d26c2\" (UniqueName: \"kubernetes.io/projected/b085cfec-e382-48c5-a623-679412c5b97e-kube-api-access-d26c2\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007037 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-config\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007058 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007079 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007100 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-audit\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007118 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007138 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fff76748-e7a2-446a-bb45-cf27a4a4e79f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007158 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007177 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-audit-dir\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007197 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58jb7\" (UniqueName: \"kubernetes.io/projected/baf610cb-71c9-4589-8aa1-74bf3030485b-kube-api-access-58jb7\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.007823 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.008473 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-dir\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.008655 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.010543 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-policies\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.016214 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.017232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.017708 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.018577 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.019581 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fff76748-e7a2-446a-bb45-cf27a4a4e79f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.021563 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fff76748-e7a2-446a-bb45-cf27a4a4e79f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.021576 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cxssc"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.021799 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.024171 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.025552 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-vmqm2"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.026669 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.026909 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.027607 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.028128 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.028417 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9kdrb"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.033224 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.034025 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.034920 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.056425 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.057239 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.058708 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.059170 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.059301 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.059634 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.057733 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.061025 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-25hvk"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.061123 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.061647 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.061762 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.062496 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zrxr5"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.062622 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.063326 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.065968 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4k6d4"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.067242 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.067976 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.069047 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.070617 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.074453 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6n9gz"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.074660 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gzhfr"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.075992 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5cbv7"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.076597 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5cbv7" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.076687 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.078947 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpqw6"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.080051 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.082801 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.083991 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.084921 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.086042 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4ngsh"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.087057 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.088977 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.090080 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pfvpm"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.091274 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.091694 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4rjd7"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.092761 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.094185 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.095063 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.097646 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.100214 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5cbv7"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.101336 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.102464 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9kdrb"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.103630 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.104689 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gzhfr"] Jan 30 
21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.105781 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.106975 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-88rvm"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.107782 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-serving-cert\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.107821 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4ksx\" (UniqueName: \"kubernetes.io/projected/e4abe8be-aa7d-46ad-a658-259955e42044-kube-api-access-c4ksx\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.107841 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.107862 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4abe8be-aa7d-46ad-a658-259955e42044-serving-cert\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.107891 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00580dd0-6712-4cd5-b651-a200271e0727-config\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.107919 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/baf610cb-71c9-4589-8aa1-74bf3030485b-node-pullsecrets\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.107944 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-audit-policies\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.107962 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-encryption-config\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108018 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krszt\" (UniqueName: \"kubernetes.io/projected/00580dd0-6712-4cd5-b651-a200271e0727-kube-api-access-krszt\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: 
\"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108021 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/baf610cb-71c9-4589-8aa1-74bf3030485b-node-pullsecrets\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108043 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-client-ca\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108060 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/00580dd0-6712-4cd5-b651-a200271e0727-images\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108076 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/00580dd0-6712-4cd5-b651-a200271e0727-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108093 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-machine-approver-tls\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108111 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-serving-cert\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108127 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-auth-proxy-config\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108173 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grrm7\" (UniqueName: \"kubernetes.io/projected/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-kube-api-access-grrm7\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108196 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-image-import-ca\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108227 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-config\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108242 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/baf610cb-71c9-4589-8aa1-74bf3030485b-audit-dir\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108259 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-etcd-client\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108276 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108298 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-etcd-serving-ca\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108313 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-config\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108329 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-etcd-client\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108344 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-encryption-config\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108365 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-config\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108416 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-audit\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108432 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108476 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-audit-dir\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108492 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108510 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58jb7\" (UniqueName: \"kubernetes.io/projected/baf610cb-71c9-4589-8aa1-74bf3030485b-kube-api-access-58jb7\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7ft\" (UniqueName: \"kubernetes.io/projected/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-kube-api-access-zm7ft\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 
21:18:09.108703 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108746 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00580dd0-6712-4cd5-b651-a200271e0727-config\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.108766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/00580dd0-6712-4cd5-b651-a200271e0727-images\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.109256 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-audit-policies\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.109617 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-etcd-serving-ca\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.109643 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-audit\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.109837 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-client-ca\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.109907 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-audit-dir\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.110115 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-config\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.110612 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-image-import-ca\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.110616 4834 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-config\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.110675 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.110768 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.110855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.110882 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/baf610cb-71c9-4589-8aa1-74bf3030485b-audit-dir\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.111014 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/baf610cb-71c9-4589-8aa1-74bf3030485b-config\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.111053 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vpnmb"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.111106 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-encryption-config\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.111185 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-auth-proxy-config\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.111711 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-etcd-client\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.112058 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/00580dd0-6712-4cd5-b651-a200271e0727-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.112604 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-serving-cert\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.112848 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9c6nm"] Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.113088 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-encryption-config\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.113703 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9c6nm"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.114371 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq"]
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.114544 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-serving-cert\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.114780 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.116170 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5"]
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.117728 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9c6nm"]
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.118199 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/baf610cb-71c9-4589-8aa1-74bf3030485b-etcd-client\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.119742 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-machine-approver-tls\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.120139 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4abe8be-aa7d-46ad-a658-259955e42044-serving-cert\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.134743 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.155370 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.175656 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.195948 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.215592 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.235886 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.255594 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.275668 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.296496 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.315701 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.335887 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.355751 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.374964 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.395825 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.416557 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.436697 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.455255 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.475417 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.495481 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.515478 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.556068 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.575507 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.595490 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.615241 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.636313 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.655565 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.675626 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.697935 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.716018 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.735726 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.755864 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.785328 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.795474 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.817583 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.835707 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.856800 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.876024 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.896158 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.916829 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.966327 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.966462 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.975559 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 30 21:18:09 crc kubenswrapper[4834]: I0130 21:18:09.995350 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.009716 4834 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.009766 4834 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.009823 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert podName:b085cfec-e382-48c5-a623-679412c5b97e nodeName:}" failed. No retries permitted until 2026-01-30 21:18:10.50979406 +0000 UTC m=+141.662940238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert") pod "route-controller-manager-6576b87f9c-gndbg" (UID: "b085cfec-e382-48c5-a623-679412c5b97e") : failed to sync secret cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.009865 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session podName:719a9eb0-8eb3-4fe7-888a-a1e9a426ed68 nodeName:}" failed. No retries permitted until 2026-01-30 21:18:10.509837651 +0000 UTC m=+141.662983819 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session") pod "oauth-openshift-558db77b4-xvvtx" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68") : failed to sync secret cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.012274 4834 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.012327 4834 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.012501 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config podName:b085cfec-e382-48c5-a623-679412c5b97e nodeName:}" failed. No retries permitted until 2026-01-30 21:18:10.512465779 +0000 UTC m=+141.665612017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config") pod "route-controller-manager-6576b87f9c-gndbg" (UID: "b085cfec-e382-48c5-a623-679412c5b97e") : failed to sync configmap cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.012548 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca podName:b085cfec-e382-48c5-a623-679412c5b97e nodeName:}" failed. No retries permitted until 2026-01-30 21:18:10.51252837 +0000 UTC m=+141.665674548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca") pod "route-controller-manager-6576b87f9c-gndbg" (UID: "b085cfec-e382-48c5-a623-679412c5b97e") : failed to sync configmap cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.014126 4834 request.go:700] Waited for 1.016678538s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dservice-ca-operator-dockercfg-rg9jl&limit=500&resourceVersion=0
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.015994 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.017085 4834 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: E0130 21:18:10.017160 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca podName:719a9eb0-8eb3-4fe7-888a-a1e9a426ed68 nodeName:}" failed. No retries permitted until 2026-01-30 21:18:10.517138956 +0000 UTC m=+141.670285134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-xvvtx" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68") : failed to sync configmap cache: timed out waiting for the condition
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.036474 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.076276 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.095931 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.116251 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.156377 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.161553 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tllls\" (UniqueName: \"kubernetes.io/projected/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-kube-api-access-tllls\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.176949 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.195928 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.216641 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.236221 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.256340 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.276052 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.323918 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdklj\" (UniqueName: \"kubernetes.io/projected/fff76748-e7a2-446a-bb45-cf27a4a4e79f-kube-api-access-zdklj\") pod \"openshift-apiserver-operator-796bbdcf4f-brd2r\" (UID: \"fff76748-e7a2-446a-bb45-cf27a4a4e79f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.336501 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.355835 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.365942 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.375921 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.395729 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.415728 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.435866 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.455950 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.475821 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.500798 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.516780 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.527937 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.528494 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.528612 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.528646 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.528708 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.535387 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.555592 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.577353 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.596388 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.620646 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r"]
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.624574 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.635232 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.656160 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.676281 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.695687 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.715685 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.735898 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.755354 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.775933 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.796223 4834 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.815814 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.836576 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.855636 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.876613 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.895533 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.943227 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4ksx\" (UniqueName: \"kubernetes.io/projected/e4abe8be-aa7d-46ad-a658-259955e42044-kube-api-access-c4ksx\") pod \"controller-manager-879f6c89f-zrxr5\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.964016 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7ft\" (UniqueName: \"kubernetes.io/projected/cbcfc442-8f5f-4da5-9289-98484d7c0cb3-kube-api-access-zm7ft\") pod \"apiserver-7bbb656c7d-fqv56\" (UID: \"cbcfc442-8f5f-4da5-9289-98484d7c0cb3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56"
Jan 30 21:18:10 crc kubenswrapper[4834]: I0130 21:18:10.975413 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krszt\" (UniqueName: \"kubernetes.io/projected/00580dd0-6712-4cd5-b651-a200271e0727-kube-api-access-krszt\") pod \"machine-api-operator-5694c8668f-5qhdb\" (UID: \"00580dd0-6712-4cd5-b651-a200271e0727\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.006764 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58jb7\" (UniqueName: \"kubernetes.io/projected/baf610cb-71c9-4589-8aa1-74bf3030485b-kube-api-access-58jb7\") pod \"apiserver-76f77b778f-wp8vc\" (UID: \"baf610cb-71c9-4589-8aa1-74bf3030485b\") " pod="openshift-apiserver/apiserver-76f77b778f-wp8vc"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.014551 4834 request.go:700] Waited for 1.900613492s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.016662 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.022682 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.023947 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grrm7\" (UniqueName: \"kubernetes.io/projected/e8a4b745-1818-4728-b0f8-e1e8eed70e6b-kube-api-access-grrm7\") pod \"machine-approver-56656f9798-qb6gd\" (UID: \"e8a4b745-1818-4728-b0f8-e1e8eed70e6b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.034355 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.037741 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.042853 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.063267 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 30 21:18:11 crc kubenswrapper[4834]: W0130 21:18:11.072680 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a4b745_1818_4728_b0f8_e1e8eed70e6b.slice/crio-f24dbb0cebbed9a221ac94c35c391f3c6665dbd6cea808f79b2337535352d399 WatchSource:0}: Error finding container f24dbb0cebbed9a221ac94c35c391f3c6665dbd6cea808f79b2337535352d399: Status 404 returned error can't find the container with id f24dbb0cebbed9a221ac94c35c391f3c6665dbd6cea808f79b2337535352d399
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.113194 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.117188 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.117343 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.126685 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d26c2\" (UniqueName: \"kubernetes.io/projected/b085cfec-e382-48c5-a623-679412c5b97e-kube-api-access-d26c2\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.137196 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.144906 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.175843 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.179996 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.181165 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.195339 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.201622 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xvvtx\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.218057 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.219440 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.236762 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238336 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12d3ad25-9e32-4467-8a93-43dfee213499-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4ngsh\" (UID: \"12d3ad25-9e32-4467-8a93-43dfee213499\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238376 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-config\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238435 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1595dff8-9195-4ab2-83ed-5ccd825b21eb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238493 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238535 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-trusted-ca\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238683 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d66829f-7076-4ef3-8812-3b37496abd89-metrics-tls\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238710 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b8d762-5d1f-4632-9685-681294725b38-serving-cert\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238734 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn8c5\" (UniqueName: \"kubernetes.io/projected/55022479-2799-4d9b-b7a2-b2aa98aad754-kube-api-access-vn8c5\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238774 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58995455-5f53-49bb-84e7-dab094ffec5b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238798 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-serving-cert\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-etcd-service-ca\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm"
Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238841 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxgxz\" (UniqueName: \"kubernetes.io/projected/151c367e-d6c2-4433-a401-2b6390b4ce09-kube-api-access-jxgxz\") pod \"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238864 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc56p\" (UniqueName: \"kubernetes.io/projected/aae73693-2f17-4d81-9e1e-f510035bd84f-kube-api-access-lc56p\") pod \"downloads-7954f5f757-25hvk\" (UID: \"aae73693-2f17-4d81-9e1e-f510035bd84f\") " pod="openshift-console/downloads-7954f5f757-25hvk" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238887 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cfx5\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-kube-api-access-2cfx5\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238923 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqspm\" (UniqueName: \"kubernetes.io/projected/ee1bf6ec-599d-4a04-b05b-32db37d474b8-kube-api-access-jqspm\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238946 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g99h\" (UniqueName: \"kubernetes.io/projected/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-kube-api-access-2g99h\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238968 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55022479-2799-4d9b-b7a2-b2aa98aad754-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.238990 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-oauth-config\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239011 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641ade1d-9c8e-4b46-b599-32b1165e7528-config\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239046 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qcp4\" (UniqueName: \"kubernetes.io/projected/dd752012-3171-434a-a470-ed59fe9a382d-kube-api-access-6qcp4\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239067 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqjxw\" (UniqueName: \"kubernetes.io/projected/207681cc-7b97-4401-89ce-2fa68270a9be-kube-api-access-vqjxw\") pod 
\"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239104 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-config\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239127 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/207681cc-7b97-4401-89ce-2fa68270a9be-signing-key\") pod \"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239186 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641ade1d-9c8e-4b46-b599-32b1165e7528-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239223 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1bf6ec-599d-4a04-b05b-32db37d474b8-config\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239247 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/573e1765-0165-4d38-840b-04b29206758f-proxy-tls\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239270 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/96b8d762-5d1f-4632-9685-681294725b38-etcd-client\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239352 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd752012-3171-434a-a470-ed59fe9a382d-serving-cert\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.239380 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb3210b-e06b-45f5-9df7-2a6b8e322223-metrics-tls\") pod \"dns-operator-744455d44c-vpnmb\" (UID: \"7bb3210b-e06b-45f5-9df7-2a6b8e322223\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.240659 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkw4w\" (UniqueName: \"kubernetes.io/projected/6362617b-267a-4763-835f-b77935ceec53-kube-api-access-bkw4w\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.240725 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58995455-5f53-49bb-84e7-dab094ffec5b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241335 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d66829f-7076-4ef3-8812-3b37496abd89-trusted-ca\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241446 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/207681cc-7b97-4401-89ce-2fa68270a9be-signing-cabundle\") pod \"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241494 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151c367e-d6c2-4433-a401-2b6390b4ce09-serving-cert\") pod \"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241519 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241541 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-etcd-ca\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241563 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8b9q\" (UniqueName: \"kubernetes.io/projected/42c52fd3-a2bb-4345-b047-592d8525bf76-kube-api-access-t8b9q\") pod \"migrator-59844c95c7-7x57c\" (UID: \"42c52fd3-a2bb-4345-b047-592d8525bf76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241585 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnzlq\" (UniqueName: \"kubernetes.io/projected/7bb3210b-e06b-45f5-9df7-2a6b8e322223-kube-api-access-hnzlq\") pod \"dns-operator-744455d44c-vpnmb\" (UID: \"7bb3210b-e06b-45f5-9df7-2a6b8e322223\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241618 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/573e1765-0165-4d38-840b-04b29206758f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241650 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1bf6ec-599d-4a04-b05b-32db37d474b8-trusted-ca\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241673 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-config\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241704 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6362617b-267a-4763-835f-b77935ceec53-serving-cert\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241724 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2fp\" (UniqueName: \"kubernetes.io/projected/96b8d762-5d1f-4632-9685-681294725b38-kube-api-access-7h2fp\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241757 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7c1a69-33bc-4cbe-933f-905775505373-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241780 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7c1a69-33bc-4cbe-933f-905775505373-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.241800 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpf6s\" (UniqueName: \"kubernetes.io/projected/12d3ad25-9e32-4467-8a93-43dfee213499-kube-api-access-cpf6s\") pod \"multus-admission-controller-857f4d67dd-4ngsh\" (UID: \"12d3ad25-9e32-4467-8a93-43dfee213499\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242173 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb7c1a69-33bc-4cbe-933f-905775505373-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242206 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnn5m\" (UniqueName: \"kubernetes.io/projected/573e1765-0165-4d38-840b-04b29206758f-kube-api-access-nnn5m\") pod 
\"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242229 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/55022479-2799-4d9b-b7a2-b2aa98aad754-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242252 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/011bab8f-7841-4e99-8d47-ee7ed71b9ec5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xqnhs\" (UID: \"011bab8f-7841-4e99-8d47-ee7ed71b9ec5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242274 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-registry-tls\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242296 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151c367e-d6c2-4433-a401-2b6390b4ce09-config\") pod \"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242330 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-trusted-ca-bundle\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242350 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58dh2\" (UniqueName: \"kubernetes.io/projected/59b3b974-9ba8-426b-8836-34ecbb56f86f-kube-api-access-58dh2\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242412 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae1a469-a830-4aef-bb3b-eb786efa1078-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242434 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdjxg\" (UniqueName: \"kubernetes.io/projected/6d66829f-7076-4ef3-8812-3b37496abd89-kube-api-access-bdjxg\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242455 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/dd752012-3171-434a-a470-ed59fe9a382d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242474 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1595dff8-9195-4ab2-83ed-5ccd825b21eb-config\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242493 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242517 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssvj\" (UniqueName: \"kubernetes.io/projected/011bab8f-7841-4e99-8d47-ee7ed71b9ec5-kube-api-access-kssvj\") pod \"control-plane-machine-set-operator-78cbb6b69f-xqnhs\" (UID: \"011bab8f-7841-4e99-8d47-ee7ed71b9ec5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242537 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-oauth-serving-cert\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242559 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvwht\" (UniqueName: \"kubernetes.io/projected/a57e91ba-e053-4f9d-bf7f-3a5e0f400d79-kube-api-access-gvwht\") pod \"cluster-samples-operator-665b6dd947-kx4ct\" (UID: \"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242578 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrwl\" (UniqueName: \"kubernetes.io/projected/f3246e84-3def-488f-8a8f-069bdc3fa563-kube-api-access-2lrwl\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242598 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-registry-certificates\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242636 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242700 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1595dff8-9195-4ab2-83ed-5ccd825b21eb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242722 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3246e84-3def-488f-8a8f-069bdc3fa563-secret-volume\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.242744 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55022479-2799-4d9b-b7a2-b2aa98aad754-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.243575 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/573e1765-0165-4d38-840b-04b29206758f-images\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.243681 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ae1a469-a830-4aef-bb3b-eb786efa1078-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.243707 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57e91ba-e053-4f9d-bf7f-3a5e0f400d79-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kx4ct\" (UID: \"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.243736 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3246e84-3def-488f-8a8f-069bdc3fa563-config-volume\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.243752 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/641ade1d-9c8e-4b46-b599-32b1165e7528-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.243962 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 21:18:11.743942903 +0000 UTC m=+142.897089141 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.244026 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9rcx\" (UniqueName: \"kubernetes.io/projected/1ae1a469-a830-4aef-bb3b-eb786efa1078-kube-api-access-r9rcx\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.244262 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-service-ca\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.244490 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.244538 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1bf6ec-599d-4a04-b05b-32db37d474b8-serving-cert\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.244575 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-bound-sa-token\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.244631 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d66829f-7076-4ef3-8812-3b37496abd89-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.252246 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert\") pod \"route-controller-manager-6576b87f9c-gndbg\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.278227 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5qhdb"] Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.290637 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.299516 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346485 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.346657 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:11.846627123 +0000 UTC m=+142.999773281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346708 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-registry-tls\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346759 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151c367e-d6c2-4433-a401-2b6390b4ce09-config\") pod \"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346797 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58dh2\" (UniqueName: \"kubernetes.io/projected/59b3b974-9ba8-426b-8836-34ecbb56f86f-kube-api-access-58dh2\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346831 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/99b4691a-d300-46e1-9ca1-ea8287465be8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: 
\"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346861 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-trusted-ca-bundle\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346886 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e640e7b-4b48-456a-a65a-31c1e2047222-cert\") pod \"ingress-canary-5cbv7\" (UID: \"0e640e7b-4b48-456a-a65a-31c1e2047222\") " pod="openshift-ingress-canary/ingress-canary-5cbv7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae1a469-a830-4aef-bb3b-eb786efa1078-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346937 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwrwd\" (UniqueName: \"kubernetes.io/projected/bf7807b8-0393-4e17-ad63-e598aa25593e-kube-api-access-wwrwd\") pod \"package-server-manager-789f6589d5-fq6x5\" (UID: \"bf7807b8-0393-4e17-ad63-e598aa25593e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346964 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdjxg\" 
(UniqueName: \"kubernetes.io/projected/6d66829f-7076-4ef3-8812-3b37496abd89-kube-api-access-bdjxg\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.346988 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kssvj\" (UniqueName: \"kubernetes.io/projected/011bab8f-7841-4e99-8d47-ee7ed71b9ec5-kube-api-access-kssvj\") pod \"control-plane-machine-set-operator-78cbb6b69f-xqnhs\" (UID: \"011bab8f-7841-4e99-8d47-ee7ed71b9ec5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347048 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/dd752012-3171-434a-a470-ed59fe9a382d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347071 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1595dff8-9195-4ab2-83ed-5ccd825b21eb-config\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347094 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-csi-data-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347117 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-oauth-serving-cert\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347141 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-registry-certificates\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347169 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvwht\" (UniqueName: \"kubernetes.io/projected/a57e91ba-e053-4f9d-bf7f-3a5e0f400d79-kube-api-access-gvwht\") pod \"cluster-samples-operator-665b6dd947-kx4ct\" (UID: \"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347200 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrwl\" (UniqueName: 
\"kubernetes.io/projected/f3246e84-3def-488f-8a8f-069bdc3fa563-kube-api-access-2lrwl\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347238 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347273 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3246e84-3def-488f-8a8f-069bdc3fa563-secret-volume\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347296 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55022479-2799-4d9b-b7a2-b2aa98aad754-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347323 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1595dff8-9195-4ab2-83ed-5ccd825b21eb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347354 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-mountpoint-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347495 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/573e1765-0165-4d38-840b-04b29206758f-images\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347528 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347555 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57e91ba-e053-4f9d-bf7f-3a5e0f400d79-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kx4ct\" (UID: \"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347578 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/f3246e84-3def-488f-8a8f-069bdc3fa563-config-volume\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347602 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/641ade1d-9c8e-4b46-b599-32b1165e7528-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347626 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ae1a469-a830-4aef-bb3b-eb786efa1078-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347652 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rcx\" (UniqueName: \"kubernetes.io/projected/1ae1a469-a830-4aef-bb3b-eb786efa1078-kube-api-access-r9rcx\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347681 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-metrics-certs\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " 
pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347707 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-service-ca\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347812 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1bf6ec-599d-4a04-b05b-32db37d474b8-serving-cert\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347886 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-bound-sa-token\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347914 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-config-volume\") pod \"dns-default-9c6nm\" 
(UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347937 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28198978-fb02-4208-9967-2c6ac9258439-webhook-cert\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347971 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d66829f-7076-4ef3-8812-3b37496abd89-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.347996 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9nb2\" (UniqueName: \"kubernetes.io/projected/99b4691a-d300-46e1-9ca1-ea8287465be8-kube-api-access-w9nb2\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: \"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348054 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12d3ad25-9e32-4467-8a93-43dfee213499-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4ngsh\" (UID: \"12d3ad25-9e32-4467-8a93-43dfee213499\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348107 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/28198978-fb02-4208-9967-2c6ac9258439-tmpfs\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348129 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djl5x\" (UniqueName: \"kubernetes.io/projected/aab9a42d-c833-46b2-a745-1bb95ada7f68-kube-api-access-djl5x\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348157 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-config\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348179 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1595dff8-9195-4ab2-83ed-5ccd825b21eb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348201 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwx4v\" (UniqueName: \"kubernetes.io/projected/03fad3ad-d98f-420e-be43-8081c20dd6d4-kube-api-access-gwx4v\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 
21:18:11.348264 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348292 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348320 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-trusted-ca\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348347 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d66829f-7076-4ef3-8812-3b37496abd89-metrics-tls\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348373 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b8d762-5d1f-4632-9685-681294725b38-serving-cert\") pod \"etcd-operator-b45778765-88rvm\" (UID: 
\"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348417 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn8c5\" (UniqueName: \"kubernetes.io/projected/55022479-2799-4d9b-b7a2-b2aa98aad754-kube-api-access-vn8c5\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348445 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58995455-5f53-49bb-84e7-dab094ffec5b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348468 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-serving-cert\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348502 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-registration-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348694 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/dd752012-3171-434a-a470-ed59fe9a382d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.348934 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/573e1765-0165-4d38-840b-04b29206758f-images\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.349218 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:11.849203288 +0000 UTC m=+143.002349426 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.349250 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151c367e-d6c2-4433-a401-2b6390b4ce09-config\") pod \"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.349762 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-service-ca\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.350284 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-oauth-serving-cert\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.350532 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-config\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 
30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.350642 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-registry-certificates\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.352010 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-registry-tls\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.352662 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ae1a469-a830-4aef-bb3b-eb786efa1078-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.352704 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.352863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.353309 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1595dff8-9195-4ab2-83ed-5ccd825b21eb-config\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.354317 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-service-ca-bundle\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.354388 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-trusted-ca-bundle\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.354917 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58995455-5f53-49bb-84e7-dab094ffec5b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355674 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jxgxz\" (UniqueName: \"kubernetes.io/projected/151c367e-d6c2-4433-a401-2b6390b4ce09-kube-api-access-jxgxz\") pod \"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355713 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-default-certificate\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355770 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c568ee8a-e57b-46b4-9852-277e61360e02-node-bootstrap-token\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355796 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-etcd-service-ca\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355835 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nh9k\" (UniqueName: \"kubernetes.io/projected/c568ee8a-e57b-46b4-9852-277e61360e02-kube-api-access-5nh9k\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " 
pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355859 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc56p\" (UniqueName: \"kubernetes.io/projected/aae73693-2f17-4d81-9e1e-f510035bd84f-kube-api-access-lc56p\") pod \"downloads-7954f5f757-25hvk\" (UID: \"aae73693-2f17-4d81-9e1e-f510035bd84f\") " pod="openshift-console/downloads-7954f5f757-25hvk" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355881 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bd7\" (UniqueName: \"kubernetes.io/projected/04d66c5d-67d0-49dd-8777-44d9bc45090b-kube-api-access-49bd7\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355905 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c568ee8a-e57b-46b4-9852-277e61360e02-certs\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355926 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wp8vc"] Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355959 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/15134023-081c-4cdd-bf87-dc07d86f3dfd-srv-cert\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:11 crc 
kubenswrapper[4834]: I0130 21:18:11.355990 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cfx5\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-kube-api-access-2cfx5\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.355994 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3246e84-3def-488f-8a8f-069bdc3fa563-config-volume\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356014 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqspm\" (UniqueName: \"kubernetes.io/projected/ee1bf6ec-599d-4a04-b05b-32db37d474b8-kube-api-access-jqspm\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356121 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g99h\" (UniqueName: \"kubernetes.io/projected/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-kube-api-access-2g99h\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356230 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55022479-2799-4d9b-b7a2-b2aa98aad754-bound-sa-token\") 
pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356290 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-oauth-config\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356314 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641ade1d-9c8e-4b46-b599-32b1165e7528-config\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356553 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-etcd-service-ca\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356662 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-trusted-ca\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356759 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqjxw\" (UniqueName: 
\"kubernetes.io/projected/207681cc-7b97-4401-89ce-2fa68270a9be-kube-api-access-vqjxw\") pod \"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356813 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641ade1d-9c8e-4b46-b599-32b1165e7528-config\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.356890 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qcp4\" (UniqueName: \"kubernetes.io/projected/dd752012-3171-434a-a470-ed59fe9a382d-kube-api-access-6qcp4\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.357302 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlkqd\" (UniqueName: \"kubernetes.io/projected/57726227-3cf1-4553-9b60-63e2082c887d-kube-api-access-xlkqd\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.357446 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-config\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.357481 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/99b4691a-d300-46e1-9ca1-ea8287465be8-srv-cert\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: \"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.357521 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/207681cc-7b97-4401-89ce-2fa68270a9be-signing-key\") pod \"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.357542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641ade1d-9c8e-4b46-b599-32b1165e7528-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.357585 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04d66c5d-67d0-49dd-8777-44d9bc45090b-proxy-tls\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.358017 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/573e1765-0165-4d38-840b-04b29206758f-proxy-tls\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.358049 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/96b8d762-5d1f-4632-9685-681294725b38-etcd-client\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.358074 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57726227-3cf1-4553-9b60-63e2082c887d-service-ca-bundle\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.358124 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1bf6ec-599d-4a04-b05b-32db37d474b8-config\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.358908 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-config\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.359203 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1bf6ec-599d-4a04-b05b-32db37d474b8-config\") pod \"console-operator-58897d9998-4rjd7\" (UID: 
\"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.359450 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-serving-cert\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.359550 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb3210b-e06b-45f5-9df7-2a6b8e322223-metrics-tls\") pod \"dns-operator-744455d44c-vpnmb\" (UID: \"7bb3210b-e06b-45f5-9df7-2a6b8e322223\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.359589 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7807b8-0393-4e17-ad63-e598aa25593e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fq6x5\" (UID: \"bf7807b8-0393-4e17-ad63-e598aa25593e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.359615 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-plugins-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.359652 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dd752012-3171-434a-a470-ed59fe9a382d-serving-cert\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.360873 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55022479-2799-4d9b-b7a2-b2aa98aad754-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.361317 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ae1a469-a830-4aef-bb3b-eb786efa1078-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362209 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkw4w\" (UniqueName: \"kubernetes.io/projected/6362617b-267a-4763-835f-b77935ceec53-kube-api-access-bkw4w\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362475 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58995455-5f53-49bb-84e7-dab094ffec5b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362532 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4ws\" (UniqueName: \"kubernetes.io/projected/28198978-fb02-4208-9967-2c6ac9258439-kube-api-access-nz4ws\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362558 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-socket-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362597 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d66829f-7076-4ef3-8812-3b37496abd89-trusted-ca\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362649 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/207681cc-7b97-4401-89ce-2fa68270a9be-signing-cabundle\") pod \"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362687 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151c367e-d6c2-4433-a401-2b6390b4ce09-serving-cert\") pod 
\"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362730 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.362762 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-etcd-ca\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.366789 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d66829f-7076-4ef3-8812-3b37496abd89-trusted-ca\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367188 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/207681cc-7b97-4401-89ce-2fa68270a9be-signing-cabundle\") pod \"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367358 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8b9q\" 
(UniqueName: \"kubernetes.io/projected/42c52fd3-a2bb-4345-b047-592d8525bf76-kube-api-access-t8b9q\") pod \"migrator-59844c95c7-7x57c\" (UID: \"42c52fd3-a2bb-4345-b047-592d8525bf76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367611 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151c367e-d6c2-4433-a401-2b6390b4ce09-serving-cert\") pod \"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367613 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnzlq\" (UniqueName: \"kubernetes.io/projected/7bb3210b-e06b-45f5-9df7-2a6b8e322223-kube-api-access-hnzlq\") pod \"dns-operator-744455d44c-vpnmb\" (UID: \"7bb3210b-e06b-45f5-9df7-2a6b8e322223\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367676 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04d66c5d-67d0-49dd-8777-44d9bc45090b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367702 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/15134023-081c-4cdd-bf87-dc07d86f3dfd-profile-collector-cert\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367731 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/573e1765-0165-4d38-840b-04b29206758f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367833 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1bf6ec-599d-4a04-b05b-32db37d474b8-trusted-ca\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367872 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-757sd\" (UniqueName: \"kubernetes.io/projected/15134023-081c-4cdd-bf87-dc07d86f3dfd-kube-api-access-757sd\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.367911 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-config\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.368000 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-metrics-tls\") pod \"dns-default-9c6nm\" (UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.368036 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjs5\" (UniqueName: \"kubernetes.io/projected/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-kube-api-access-prjs5\") pod \"dns-default-9c6nm\" (UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.368067 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2fp\" (UniqueName: \"kubernetes.io/projected/96b8d762-5d1f-4632-9685-681294725b38-kube-api-access-7h2fp\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.368094 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-stats-auth\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.368161 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6362617b-267a-4763-835f-b77935ceec53-serving-cert\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.368854 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/96b8d762-5d1f-4632-9685-681294725b38-etcd-ca\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.369786 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/573e1765-0165-4d38-840b-04b29206758f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.369834 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6362617b-267a-4763-835f-b77935ceec53-config\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370099 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b8d762-5d1f-4632-9685-681294725b38-serving-cert\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370151 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpf6s\" (UniqueName: \"kubernetes.io/projected/12d3ad25-9e32-4467-8a93-43dfee213499-kube-api-access-cpf6s\") pod \"multus-admission-controller-857f4d67dd-4ngsh\" (UID: \"12d3ad25-9e32-4467-8a93-43dfee213499\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370196 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28198978-fb02-4208-9967-2c6ac9258439-apiservice-cert\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370221 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8z96\" (UniqueName: \"kubernetes.io/projected/0e640e7b-4b48-456a-a65a-31c1e2047222-kube-api-access-h8z96\") pod \"ingress-canary-5cbv7\" (UID: \"0e640e7b-4b48-456a-a65a-31c1e2047222\") " pod="openshift-ingress-canary/ingress-canary-5cbv7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370223 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1bf6ec-599d-4a04-b05b-32db37d474b8-trusted-ca\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370244 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7c1a69-33bc-4cbe-933f-905775505373-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370374 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7c1a69-33bc-4cbe-933f-905775505373-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370437 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/55022479-2799-4d9b-b7a2-b2aa98aad754-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.370909 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7c1a69-33bc-4cbe-933f-905775505373-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.371026 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb7c1a69-33bc-4cbe-933f-905775505373-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.371060 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnn5m\" (UniqueName: \"kubernetes.io/projected/573e1765-0165-4d38-840b-04b29206758f-kube-api-access-nnn5m\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.371080 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/011bab8f-7841-4e99-8d47-ee7ed71b9ec5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xqnhs\" (UID: \"011bab8f-7841-4e99-8d47-ee7ed71b9ec5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.372363 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb3210b-e06b-45f5-9df7-2a6b8e322223-metrics-tls\") pod \"dns-operator-744455d44c-vpnmb\" (UID: \"7bb3210b-e06b-45f5-9df7-2a6b8e322223\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.372819 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1595dff8-9195-4ab2-83ed-5ccd825b21eb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.373360 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.375206 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee1bf6ec-599d-4a04-b05b-32db37d474b8-serving-cert\") pod \"console-operator-58897d9998-4rjd7\" (UID: 
\"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.375233 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/011bab8f-7841-4e99-8d47-ee7ed71b9ec5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xqnhs\" (UID: \"011bab8f-7841-4e99-8d47-ee7ed71b9ec5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.375244 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd752012-3171-434a-a470-ed59fe9a382d-serving-cert\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.375477 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12d3ad25-9e32-4467-8a93-43dfee213499-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4ngsh\" (UID: \"12d3ad25-9e32-4467-8a93-43dfee213499\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.375726 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641ade1d-9c8e-4b46-b599-32b1165e7528-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.375985 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/573e1765-0165-4d38-840b-04b29206758f-proxy-tls\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.376372 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d66829f-7076-4ef3-8812-3b37496abd89-metrics-tls\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.376763 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/207681cc-7b97-4401-89ce-2fa68270a9be-signing-key\") pod \"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.377193 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3246e84-3def-488f-8a8f-069bdc3fa563-secret-volume\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.378055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/96b8d762-5d1f-4632-9685-681294725b38-etcd-client\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.378424 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-oauth-config\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.379525 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a57e91ba-e053-4f9d-bf7f-3a5e0f400d79-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kx4ct\" (UID: \"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.380792 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7c1a69-33bc-4cbe-933f-905775505373-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.381946 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6362617b-267a-4763-835f-b77935ceec53-serving-cert\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.382020 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58995455-5f53-49bb-84e7-dab094ffec5b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc 
kubenswrapper[4834]: I0130 21:18:11.386024 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/55022479-2799-4d9b-b7a2-b2aa98aad754-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.401251 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zrxr5"] Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.404374 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9rcx\" (UniqueName: \"kubernetes.io/projected/1ae1a469-a830-4aef-bb3b-eb786efa1078-kube-api-access-r9rcx\") pod \"openshift-controller-manager-operator-756b6f6bc6-n7h7d\" (UID: \"1ae1a469-a830-4aef-bb3b-eb786efa1078\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.419827 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d66829f-7076-4ef3-8812-3b37496abd89-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: W0130 21:18:11.425826 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaf610cb_71c9_4589_8aa1_74bf3030485b.slice/crio-b835b58b60de4d0f2a4725b29f91f506a4d349bb9fd57ce317d7e3b3d3cf4170 WatchSource:0}: Error finding container b835b58b60de4d0f2a4725b29f91f506a4d349bb9fd57ce317d7e3b3d3cf4170: Status 404 returned error can't find the container with id 
b835b58b60de4d0f2a4725b29f91f506a4d349bb9fd57ce317d7e3b3d3cf4170 Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.435760 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrwl\" (UniqueName: \"kubernetes.io/projected/f3246e84-3def-488f-8a8f-069bdc3fa563-kube-api-access-2lrwl\") pod \"collect-profiles-29496795-rhdv2\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.440590 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" event={"ID":"fff76748-e7a2-446a-bb45-cf27a4a4e79f","Type":"ContainerStarted","Data":"4e5228de7f6afd7b98161b28b8f44fc70a13fa28b712e5b183b482511186872d"} Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.440626 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" event={"ID":"fff76748-e7a2-446a-bb45-cf27a4a4e79f","Type":"ContainerStarted","Data":"3eb96db19aed44f6b99bde2329e74d67df11c34282a74335303094f1169c7b19"} Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.441645 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" event={"ID":"e8a4b745-1818-4728-b0f8-e1e8eed70e6b","Type":"ContainerStarted","Data":"f24dbb0cebbed9a221ac94c35c391f3c6665dbd6cea808f79b2337535352d399"} Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.444686 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" event={"ID":"baf610cb-71c9-4589-8aa1-74bf3030485b","Type":"ContainerStarted","Data":"b835b58b60de4d0f2a4725b29f91f506a4d349bb9fd57ce317d7e3b3d3cf4170"} Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.453363 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" event={"ID":"e4abe8be-aa7d-46ad-a658-259955e42044","Type":"ContainerStarted","Data":"a8d089fe33941de8b6ef439c9b223a36b10a46e4c4b10dcffe4b07f584642ab8"} Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.456370 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1595dff8-9195-4ab2-83ed-5ccd825b21eb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bjbm4\" (UID: \"1595dff8-9195-4ab2-83ed-5ccd825b21eb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.461223 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" event={"ID":"00580dd0-6712-4cd5-b651-a200271e0727","Type":"ContainerStarted","Data":"7678afe0d2958c0cc5e713d7d8a25bd33aa98117de2fe220ccdeade6d7e68a7c"} Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.470981 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn8c5\" (UniqueName: \"kubernetes.io/projected/55022479-2799-4d9b-b7a2-b2aa98aad754-kube-api-access-vn8c5\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.472551 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.472779 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/15134023-081c-4cdd-bf87-dc07d86f3dfd-srv-cert\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.472809 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49bd7\" (UniqueName: \"kubernetes.io/projected/04d66c5d-67d0-49dd-8777-44d9bc45090b-kube-api-access-49bd7\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.472838 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c568ee8a-e57b-46b4-9852-277e61360e02-certs\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.472940 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlkqd\" (UniqueName: \"kubernetes.io/projected/57726227-3cf1-4553-9b60-63e2082c887d-kube-api-access-xlkqd\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.472963 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/99b4691a-d300-46e1-9ca1-ea8287465be8-srv-cert\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: \"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 
21:18:11.472992 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04d66c5d-67d0-49dd-8777-44d9bc45090b-proxy-tls\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.473040 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57726227-3cf1-4553-9b60-63e2082c887d-service-ca-bundle\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.473084 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7807b8-0393-4e17-ad63-e598aa25593e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fq6x5\" (UID: \"bf7807b8-0393-4e17-ad63-e598aa25593e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.473114 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-plugins-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.473679 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 21:18:11.973663018 +0000 UTC m=+143.126809156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.473720 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4ws\" (UniqueName: \"kubernetes.io/projected/28198978-fb02-4208-9967-2c6ac9258439-kube-api-access-nz4ws\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.473743 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-socket-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474403 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04d66c5d-67d0-49dd-8777-44d9bc45090b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474434 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/15134023-081c-4cdd-bf87-dc07d86f3dfd-profile-collector-cert\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474455 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-757sd\" (UniqueName: \"kubernetes.io/projected/15134023-081c-4cdd-bf87-dc07d86f3dfd-kube-api-access-757sd\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474487 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjs5\" (UniqueName: \"kubernetes.io/projected/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-kube-api-access-prjs5\") pod \"dns-default-9c6nm\" (UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-metrics-tls\") pod \"dns-default-9c6nm\" (UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474551 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-stats-auth\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474583 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28198978-fb02-4208-9967-2c6ac9258439-apiservice-cert\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474608 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8z96\" (UniqueName: \"kubernetes.io/projected/0e640e7b-4b48-456a-a65a-31c1e2047222-kube-api-access-h8z96\") pod \"ingress-canary-5cbv7\" (UID: \"0e640e7b-4b48-456a-a65a-31c1e2047222\") " pod="openshift-ingress-canary/ingress-canary-5cbv7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474655 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/99b4691a-d300-46e1-9ca1-ea8287465be8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: \"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474674 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e640e7b-4b48-456a-a65a-31c1e2047222-cert\") pod \"ingress-canary-5cbv7\" (UID: \"0e640e7b-4b48-456a-a65a-31c1e2047222\") " pod="openshift-ingress-canary/ingress-canary-5cbv7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474696 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwrwd\" (UniqueName: \"kubernetes.io/projected/bf7807b8-0393-4e17-ad63-e598aa25593e-kube-api-access-wwrwd\") pod \"package-server-manager-789f6589d5-fq6x5\" (UID: \"bf7807b8-0393-4e17-ad63-e598aa25593e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 
21:18:11.474743 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-csi-data-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474777 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474817 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-mountpoint-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474838 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474866 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-metrics-certs\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " 
pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474886 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28198978-fb02-4208-9967-2c6ac9258439-webhook-cert\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-config-volume\") pod \"dns-default-9c6nm\" (UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474940 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9nb2\" (UniqueName: \"kubernetes.io/projected/99b4691a-d300-46e1-9ca1-ea8287465be8-kube-api-access-w9nb2\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: \"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474977 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/28198978-fb02-4208-9967-2c6ac9258439-tmpfs\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.474998 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djl5x\" (UniqueName: \"kubernetes.io/projected/aab9a42d-c833-46b2-a745-1bb95ada7f68-kube-api-access-djl5x\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: 
\"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.475018 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwx4v\" (UniqueName: \"kubernetes.io/projected/03fad3ad-d98f-420e-be43-8081c20dd6d4-kube-api-access-gwx4v\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.475047 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.475074 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-registration-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.475115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-default-certificate\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.475134 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/c568ee8a-e57b-46b4-9852-277e61360e02-node-bootstrap-token\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.475160 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nh9k\" (UniqueName: \"kubernetes.io/projected/c568ee8a-e57b-46b4-9852-277e61360e02-kube-api-access-5nh9k\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.475769 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-plugins-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.476337 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-csi-data-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.476517 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57726227-3cf1-4553-9b60-63e2082c887d-service-ca-bundle\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.476722 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-mountpoint-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.477167 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7807b8-0393-4e17-ad63-e598aa25593e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fq6x5\" (UID: \"bf7807b8-0393-4e17-ad63-e598aa25593e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.477385 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04d66c5d-67d0-49dd-8777-44d9bc45090b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.477789 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.478668 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-socket-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.484020 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28198978-fb02-4208-9967-2c6ac9258439-webhook-cert\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.484555 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-config-volume\") pod \"dns-default-9c6nm\" (UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.484891 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/28198978-fb02-4208-9967-2c6ac9258439-tmpfs\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.485151 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:11.985139946 +0000 UTC m=+143.138286084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.487132 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04d66c5d-67d0-49dd-8777-44d9bc45090b-proxy-tls\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.487949 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-metrics-certs\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.489232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/99b4691a-d300-46e1-9ca1-ea8287465be8-srv-cert\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: \"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.493510 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e640e7b-4b48-456a-a65a-31c1e2047222-cert\") pod \"ingress-canary-5cbv7\" (UID: \"0e640e7b-4b48-456a-a65a-31c1e2047222\") " 
pod="openshift-ingress-canary/ingress-canary-5cbv7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.493614 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-stats-auth\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.493985 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvwht\" (UniqueName: \"kubernetes.io/projected/a57e91ba-e053-4f9d-bf7f-3a5e0f400d79-kube-api-access-gvwht\") pod \"cluster-samples-operator-665b6dd947-kx4ct\" (UID: \"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.494160 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-metrics-tls\") pod \"dns-default-9c6nm\" (UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.495900 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c568ee8a-e57b-46b4-9852-277e61360e02-certs\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.496447 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03fad3ad-d98f-420e-be43-8081c20dd6d4-registration-dir\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " 
pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.499370 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/15134023-081c-4cdd-bf87-dc07d86f3dfd-srv-cert\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.499389 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c568ee8a-e57b-46b4-9852-277e61360e02-node-bootstrap-token\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.499518 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/15134023-081c-4cdd-bf87-dc07d86f3dfd-profile-collector-cert\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.502646 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/99b4691a-d300-46e1-9ca1-ea8287465be8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: \"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.509964 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28198978-fb02-4208-9967-2c6ac9258439-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.511804 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.513155 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58dh2\" (UniqueName: \"kubernetes.io/projected/59b3b974-9ba8-426b-8836-34ecbb56f86f-kube-api-access-58dh2\") pod \"console-f9d7485db-4k6d4\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.527383 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/57726227-3cf1-4553-9b60-63e2082c887d-default-certificate\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.542701 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdjxg\" (UniqueName: \"kubernetes.io/projected/6d66829f-7076-4ef3-8812-3b37496abd89-kube-api-access-bdjxg\") pod \"ingress-operator-5b745b69d9-x44bc\" (UID: \"6d66829f-7076-4ef3-8812-3b37496abd89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.544142 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.554507 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.554818 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssvj\" (UniqueName: \"kubernetes.io/projected/011bab8f-7841-4e99-8d47-ee7ed71b9ec5-kube-api-access-kssvj\") pod \"control-plane-machine-set-operator-78cbb6b69f-xqnhs\" (UID: \"011bab8f-7841-4e99-8d47-ee7ed71b9ec5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.573509 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.575741 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.576150 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:12.076132172 +0000 UTC m=+143.229278310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.587174 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/641ade1d-9c8e-4b46-b599-32b1165e7528-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gpzgc\" (UID: \"641ade1d-9c8e-4b46-b599-32b1165e7528\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.588712 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"] Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.589312 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-bound-sa-token\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.589610 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.612634 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc56p\" (UniqueName: \"kubernetes.io/projected/aae73693-2f17-4d81-9e1e-f510035bd84f-kube-api-access-lc56p\") pod \"downloads-7954f5f757-25hvk\" (UID: \"aae73693-2f17-4d81-9e1e-f510035bd84f\") " pod="openshift-console/downloads-7954f5f757-25hvk" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.614064 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xvvtx"] Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.625811 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.629357 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqspm\" (UniqueName: \"kubernetes.io/projected/ee1bf6ec-599d-4a04-b05b-32db37d474b8-kube-api-access-jqspm\") pod \"console-operator-58897d9998-4rjd7\" (UID: \"ee1bf6ec-599d-4a04-b05b-32db37d474b8\") " pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: W0130 21:18:11.632188 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb085cfec_e382_48c5_a623_679412c5b97e.slice/crio-aa5b7a21d1529f620ff06b51a6fbbdbefb35a469a6adb3f77d0e67f17dd01aac WatchSource:0}: Error finding container aa5b7a21d1529f620ff06b51a6fbbdbefb35a469a6adb3f77d0e67f17dd01aac: Status 404 returned error can't find the container with id aa5b7a21d1529f620ff06b51a6fbbdbefb35a469a6adb3f77d0e67f17dd01aac Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.638464 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.649552 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cfx5\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-kube-api-access-2cfx5\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.664574 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.669009 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g99h\" (UniqueName: \"kubernetes.io/projected/2f3dca3c-cddc-472e-8d57-ac479e9adcc6-kube-api-access-2g99h\") pod \"kube-storage-version-migrator-operator-b67b599dd-4vdhp\" (UID: \"2f3dca3c-cddc-472e-8d57-ac479e9adcc6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.679246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.679834 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 21:18:12.179819791 +0000 UTC m=+143.332965929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.686428 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56"] Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.698268 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxgxz\" (UniqueName: \"kubernetes.io/projected/151c367e-d6c2-4433-a401-2b6390b4ce09-kube-api-access-jxgxz\") pod \"service-ca-operator-777779d784-pq6dw\" (UID: \"151c367e-d6c2-4433-a401-2b6390b4ce09\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.718343 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55022479-2799-4d9b-b7a2-b2aa98aad754-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-n92wx\" (UID: \"55022479-2799-4d9b-b7a2-b2aa98aad754\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.726919 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.741051 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqjxw\" (UniqueName: \"kubernetes.io/projected/207681cc-7b97-4401-89ce-2fa68270a9be-kube-api-access-vqjxw\") pod \"service-ca-9c57cc56f-6n9gz\" (UID: \"207681cc-7b97-4401-89ce-2fa68270a9be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.754113 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qcp4\" (UniqueName: \"kubernetes.io/projected/dd752012-3171-434a-a470-ed59fe9a382d-kube-api-access-6qcp4\") pod \"openshift-config-operator-7777fb866f-9x6k8\" (UID: \"dd752012-3171-434a-a470-ed59fe9a382d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.777001 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkw4w\" (UniqueName: \"kubernetes.io/projected/6362617b-267a-4763-835f-b77935ceec53-kube-api-access-bkw4w\") pod \"authentication-operator-69f744f599-hpqw6\" (UID: \"6362617b-267a-4763-835f-b77935ceec53\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.782648 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.783863 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:12.28383744 +0000 UTC m=+143.436983578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.785904 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.795480 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnzlq\" (UniqueName: \"kubernetes.io/projected/7bb3210b-e06b-45f5-9df7-2a6b8e322223-kube-api-access-hnzlq\") pod \"dns-operator-744455d44c-vpnmb\" (UID: \"7bb3210b-e06b-45f5-9df7-2a6b8e322223\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.796276 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.802134 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.809656 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-25hvk" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.838741 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.842161 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2fp\" (UniqueName: \"kubernetes.io/projected/96b8d762-5d1f-4632-9685-681294725b38-kube-api-access-7h2fp\") pod \"etcd-operator-b45778765-88rvm\" (UID: \"96b8d762-5d1f-4632-9685-681294725b38\") " pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.854944 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8b9q\" (UniqueName: \"kubernetes.io/projected/42c52fd3-a2bb-4345-b047-592d8525bf76-kube-api-access-t8b9q\") pod \"migrator-59844c95c7-7x57c\" (UID: \"42c52fd3-a2bb-4345-b047-592d8525bf76\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.859100 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.876232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpf6s\" (UniqueName: \"kubernetes.io/projected/12d3ad25-9e32-4467-8a93-43dfee213499-kube-api-access-cpf6s\") pod \"multus-admission-controller-857f4d67dd-4ngsh\" (UID: \"12d3ad25-9e32-4467-8a93-43dfee213499\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.882658 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.884653 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.885081 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:12.385065767 +0000 UTC m=+143.538211905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.904224 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnn5m\" (UniqueName: \"kubernetes.io/projected/573e1765-0165-4d38-840b-04b29206758f-kube-api-access-nnn5m\") pod \"machine-config-operator-74547568cd-p2d2l\" (UID: \"573e1765-0165-4d38-840b-04b29206758f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.913787 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.920467 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4k6d4"] Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.930845 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc"] Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.931990 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb7c1a69-33bc-4cbe-933f-905775505373-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gr5pq\" (UID: \"eb7c1a69-33bc-4cbe-933f-905775505373\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.934184 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlkqd\" (UniqueName: \"kubernetes.io/projected/57726227-3cf1-4553-9b60-63e2082c887d-kube-api-access-xlkqd\") pod \"router-default-5444994796-vmqm2\" (UID: \"57726227-3cf1-4553-9b60-63e2082c887d\") " pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.949036 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" Jan 30 21:18:11 crc kubenswrapper[4834]: W0130 21:18:11.957219 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59b3b974_9ba8_426b_8836_34ecbb56f86f.slice/crio-8250b2fab1f8cf7e7e7e4a048289f9e49c58030af2172fb4bea37699126ce415 WatchSource:0}: Error finding container 8250b2fab1f8cf7e7e7e4a048289f9e49c58030af2172fb4bea37699126ce415: Status 404 returned error can't find the container with id 8250b2fab1f8cf7e7e7e4a048289f9e49c58030af2172fb4bea37699126ce415 Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.957330 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nh9k\" (UniqueName: \"kubernetes.io/projected/c568ee8a-e57b-46b4-9852-277e61360e02-kube-api-access-5nh9k\") pod \"machine-config-server-cxssc\" (UID: \"c568ee8a-e57b-46b4-9852-277e61360e02\") " pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.979962 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.980848 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djl5x\" (UniqueName: \"kubernetes.io/projected/aab9a42d-c833-46b2-a745-1bb95ada7f68-kube-api-access-djl5x\") pod \"marketplace-operator-79b997595-9kdrb\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") " pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.985203 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:11 crc kubenswrapper[4834]: E0130 21:18:11.985572 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:12.485552562 +0000 UTC m=+143.638698710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.995888 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwrwd\" (UniqueName: \"kubernetes.io/projected/bf7807b8-0393-4e17-ad63-e598aa25593e-kube-api-access-wwrwd\") pod \"package-server-manager-789f6589d5-fq6x5\" (UID: \"bf7807b8-0393-4e17-ad63-e598aa25593e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:11 crc kubenswrapper[4834]: I0130 21:18:11.996899 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.008086 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.012038 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4ws\" (UniqueName: \"kubernetes.io/projected/28198978-fb02-4208-9967-2c6ac9258439-kube-api-access-nz4ws\") pod \"packageserver-d55dfcdfc-rtdcj\" (UID: \"28198978-fb02-4208-9967-2c6ac9258439\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.038032 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.039693 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-757sd\" (UniqueName: \"kubernetes.io/projected/15134023-081c-4cdd-bf87-dc07d86f3dfd-kube-api-access-757sd\") pod \"catalog-operator-68c6474976-lmpvg\" (UID: \"15134023-081c-4cdd-bf87-dc07d86f3dfd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.051218 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cxssc" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.057875 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49bd7\" (UniqueName: \"kubernetes.io/projected/04d66c5d-67d0-49dd-8777-44d9bc45090b-kube-api-access-49bd7\") pod \"machine-config-controller-84d6567774-9wj8w\" (UID: \"04d66c5d-67d0-49dd-8777-44d9bc45090b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.070263 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.079489 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwx4v\" (UniqueName: \"kubernetes.io/projected/03fad3ad-d98f-420e-be43-8081c20dd6d4-kube-api-access-gwx4v\") pod \"csi-hostpathplugin-gzhfr\" (UID: \"03fad3ad-d98f-420e-be43-8081c20dd6d4\") " pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.084080 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.090813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.091125 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:12.591113515 +0000 UTC m=+143.744259653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.094150 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9nb2\" (UniqueName: \"kubernetes.io/projected/99b4691a-d300-46e1-9ca1-ea8287465be8-kube-api-access-w9nb2\") pod \"olm-operator-6b444d44fb-cmds5\" (UID: \"99b4691a-d300-46e1-9ca1-ea8287465be8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.095560 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.113987 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.126963 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.136054 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjs5\" (UniqueName: \"kubernetes.io/projected/00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a-kube-api-access-prjs5\") pod \"dns-default-9c6nm\" (UID: \"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a\") " pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.136433 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.136538 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8z96\" (UniqueName: \"kubernetes.io/projected/0e640e7b-4b48-456a-a65a-31c1e2047222-kube-api-access-h8z96\") pod \"ingress-canary-5cbv7\" (UID: \"0e640e7b-4b48-456a-a65a-31c1e2047222\") " pod="openshift-ingress-canary/ingress-canary-5cbv7" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.147440 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.157955 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.169942 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5cbv7" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.170447 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.179186 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.192792 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.193157 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:12.693134215 +0000 UTC m=+143.846280353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.197907 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.227318 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.256062 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.260703 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.309588 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.310945 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 21:18:12.810900348 +0000 UTC m=+143.964046486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.346289 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.353034 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc"] Jan 30 21:18:12 crc kubenswrapper[4834]: W0130 21:18:12.418282 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d66829f_7076_4ef3_8812_3b37496abd89.slice/crio-c810f6523ad39cee206e3c739402d0377018aba064eb6d26c1c63c48ee498a57 WatchSource:0}: Error finding container c810f6523ad39cee206e3c739402d0377018aba064eb6d26c1c63c48ee498a57: Status 404 returned error can't find the container with id c810f6523ad39cee206e3c739402d0377018aba064eb6d26c1c63c48ee498a57 Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.422303 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.422939 4834 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:12.922914572 +0000 UTC m=+144.076060710 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: W0130 21:18:12.431168 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57726227_3cf1_4553_9b60_63e2082c887d.slice/crio-03c0decdea95afa00ca3a329c9f582a7ce8c43d66f2e14d562c4537f5a878c69 WatchSource:0}: Error finding container 03c0decdea95afa00ca3a329c9f582a7ce8c43d66f2e14d562c4537f5a878c69: Status 404 returned error can't find the container with id 03c0decdea95afa00ca3a329c9f582a7ce8c43d66f2e14d562c4537f5a878c69 Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.452723 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4rjd7"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.471220 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.477316 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hpqw6"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.512234 4834 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-25hvk"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.524568 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.525244 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.025228601 +0000 UTC m=+144.178374739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.572509 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.578762 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" event={"ID":"641ade1d-9c8e-4b46-b599-32b1165e7528","Type":"ContainerStarted","Data":"574fdea46c300315529915d09fb5d0818b82058d52ee7fad5f0877eec248cbf1"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.584074 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" event={"ID":"cbcfc442-8f5f-4da5-9289-98484d7c0cb3","Type":"ContainerStarted","Data":"dd14131991fab36c7a80e026d3a3ed984480315b6a52921728c6940b5e8b6bf9"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.587125 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" event={"ID":"2f3dca3c-cddc-472e-8d57-ac479e9adcc6","Type":"ContainerStarted","Data":"1102b583cc9f9a0178390c72561f5f64094314af5fea6c9f9c02c1c3683700ee"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.588027 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vmqm2" event={"ID":"57726227-3cf1-4553-9b60-63e2082c887d","Type":"ContainerStarted","Data":"03c0decdea95afa00ca3a329c9f582a7ce8c43d66f2e14d562c4537f5a878c69"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.590582 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" event={"ID":"00580dd0-6712-4cd5-b651-a200271e0727","Type":"ContainerStarted","Data":"9eb9741d9cfae0040154d1ede23cb520d755e3b6a93ecd23266c40a386442a43"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.590612 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" event={"ID":"00580dd0-6712-4cd5-b651-a200271e0727","Type":"ContainerStarted","Data":"9990a8a5bb2926443c77150426fe9ea360bc4d7b756267e81eb322944ba111a5"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.592169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" event={"ID":"1ae1a469-a830-4aef-bb3b-eb786efa1078","Type":"ContainerStarted","Data":"a06b99c2252212f9b56ba22693d010852568faa70ccf7844ce7249535cd3c7e3"} Jan 30 21:18:12 crc 
kubenswrapper[4834]: I0130 21:18:12.598862 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" event={"ID":"e4abe8be-aa7d-46ad-a658-259955e42044","Type":"ContainerStarted","Data":"5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.601016 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.613165 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.619624 4834 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zrxr5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.619718 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" podUID="e4abe8be-aa7d-46ad-a658-259955e42044" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.620216 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" event={"ID":"011bab8f-7841-4e99-8d47-ee7ed71b9ec5","Type":"ContainerStarted","Data":"cec2cf2ff741c7a96f592f4e2d128730f62febfe78c170bc08ba0090cf20c614"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.622110 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" event={"ID":"f3246e84-3def-488f-8a8f-069bdc3fa563","Type":"ContainerStarted","Data":"028c706b318ce0de3464448e638c0baed6b4615d2a86188be51942ad871faaf3"} Jan 30 21:18:12 crc kubenswrapper[4834]: W0130 21:18:12.626420 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd752012_3171_434a_a470_ed59fe9a382d.slice/crio-4a7bf6517ccac36c4b88af90fa6daeffce6c6ac6a22dcfd62fe82912f72acad5 WatchSource:0}: Error finding container 4a7bf6517ccac36c4b88af90fa6daeffce6c6ac6a22dcfd62fe82912f72acad5: Status 404 returned error can't find the container with id 4a7bf6517ccac36c4b88af90fa6daeffce6c6ac6a22dcfd62fe82912f72acad5 Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.627001 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.627499 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.127451397 +0000 UTC m=+144.280597535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.627772 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.628432 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.128388995 +0000 UTC m=+144.281535133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.632108 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" event={"ID":"b085cfec-e382-48c5-a623-679412c5b97e","Type":"ContainerStarted","Data":"4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.632162 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" event={"ID":"b085cfec-e382-48c5-a623-679412c5b97e","Type":"ContainerStarted","Data":"aa5b7a21d1529f620ff06b51a6fbbdbefb35a469a6adb3f77d0e67f17dd01aac"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.632306 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.642570 4834 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gndbg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.642641 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" 
podUID="b085cfec-e382-48c5-a623-679412c5b97e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.656208 4834 generic.go:334] "Generic (PLEG): container finished" podID="baf610cb-71c9-4589-8aa1-74bf3030485b" containerID="687abbc07d21e709d2162381c5d334b7857117dcbbc82e8e7c3895683fe8a4ae" exitCode=0 Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.657413 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" event={"ID":"baf610cb-71c9-4589-8aa1-74bf3030485b","Type":"ContainerDied","Data":"687abbc07d21e709d2162381c5d334b7857117dcbbc82e8e7c3895683fe8a4ae"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.660477 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" event={"ID":"6d66829f-7076-4ef3-8812-3b37496abd89","Type":"ContainerStarted","Data":"c810f6523ad39cee206e3c739402d0377018aba064eb6d26c1c63c48ee498a57"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.677185 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" event={"ID":"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68","Type":"ContainerStarted","Data":"596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.677243 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" event={"ID":"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68","Type":"ContainerStarted","Data":"977ed8e7fcff9be07983124dcb4830f2e76bf2ea12a4603f9bebdd0980cba2a2"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.677884 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 
21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.687838 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4k6d4" event={"ID":"59b3b974-9ba8-426b-8836-34ecbb56f86f","Type":"ContainerStarted","Data":"8250b2fab1f8cf7e7e7e4a048289f9e49c58030af2172fb4bea37699126ce415"} Jan 30 21:18:12 crc kubenswrapper[4834]: W0130 21:18:12.691638 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6362617b_267a_4763_835f_b77935ceec53.slice/crio-3a5124902828db22b8a50b1a9704d0b8bcafeaf502de55131cfd29aa0fb39d13 WatchSource:0}: Error finding container 3a5124902828db22b8a50b1a9704d0b8bcafeaf502de55131cfd29aa0fb39d13: Status 404 returned error can't find the container with id 3a5124902828db22b8a50b1a9704d0b8bcafeaf502de55131cfd29aa0fb39d13 Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.692953 4834 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xvvtx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.693017 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.698436 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" event={"ID":"e8a4b745-1818-4728-b0f8-e1e8eed70e6b","Type":"ContainerStarted","Data":"28cfaaeb0a4076b6d6675aa0990e1972abf6ef455d57e49809edf881fc641c65"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.698498 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" event={"ID":"e8a4b745-1818-4728-b0f8-e1e8eed70e6b","Type":"ContainerStarted","Data":"daaf92d6943560f87b6dfb7bf1477128f13bcb51f2c1ad75a858e90ad9178115"} Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.700942 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" event={"ID":"1595dff8-9195-4ab2-83ed-5ccd825b21eb","Type":"ContainerStarted","Data":"42e350ad0a459aff34ec7f4d9bf85d0787ea88ffecccee78aa2c13773bf6e771"} Jan 30 21:18:12 crc kubenswrapper[4834]: W0130 21:18:12.715329 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee1bf6ec_599d_4a04_b05b_32db37d474b8.slice/crio-3ca09dd69cda9e90ccb854479203ca7c597277b4b297ea47d0d078cbe1dc9cf3 WatchSource:0}: Error finding container 3ca09dd69cda9e90ccb854479203ca7c597277b4b297ea47d0d078cbe1dc9cf3: Status 404 returned error can't find the container with id 3ca09dd69cda9e90ccb854479203ca7c597277b4b297ea47d0d078cbe1dc9cf3 Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.729062 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.731833 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.231783995 +0000 UTC m=+144.384930133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.796704 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.806272 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6n9gz"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.833996 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.836406 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.336374931 +0000 UTC m=+144.489521069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.873902 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vpnmb"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.877618 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.889619 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-88rvm"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.893874 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.909776 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq"] Jan 30 21:18:12 crc kubenswrapper[4834]: I0130 21:18:12.936204 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:12 crc kubenswrapper[4834]: W0130 21:18:12.936276 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c52fd3_a2bb_4345_b047_592d8525bf76.slice/crio-620d4aab817d5833a0fed913b6a21403337e601a39f8a32464b523df09bb2e52 WatchSource:0}: Error finding container 620d4aab817d5833a0fed913b6a21403337e601a39f8a32464b523df09bb2e52: Status 404 returned error can't find the container with id 620d4aab817d5833a0fed913b6a21403337e601a39f8a32464b523df09bb2e52 Jan 30 21:18:12 crc kubenswrapper[4834]: E0130 21:18:12.936583 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.436565427 +0000 UTC m=+144.589711565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:12 crc kubenswrapper[4834]: W0130 21:18:12.995513 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15134023_081c_4cdd_bf87_dc07d86f3dfd.slice/crio-19179d248226e58146f8db1cf3b4ca506cac1ea6d0268d84299321332a6a17a5 WatchSource:0}: Error finding container 19179d248226e58146f8db1cf3b4ca506cac1ea6d0268d84299321332a6a17a5: Status 404 returned error can't find the container with id 19179d248226e58146f8db1cf3b4ca506cac1ea6d0268d84299321332a6a17a5 Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.039447 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.040133 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.540119933 +0000 UTC m=+144.693266071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.046213 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.141140 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.142001 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 21:18:13.641983128 +0000 UTC m=+144.795129266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.174826 4834 csr.go:261] certificate signing request csr-crpbr is approved, waiting to be issued Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.186083 4834 csr.go:257] certificate signing request csr-crpbr is issued Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.201632 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.204453 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.252053 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.252832 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-30 21:18:13.752812937 +0000 UTC m=+144.905959075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.252454 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.252904 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gzhfr"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.313328 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9kdrb"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.325032 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.356758 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.362513 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-01-30 21:18:13.862490173 +0000 UTC m=+145.015636311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.380364 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.381439 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.88142541 +0000 UTC m=+145.034571548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.451174 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9c6nm"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.488415 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.489189 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:13.989176368 +0000 UTC m=+145.142322506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.572153 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4ngsh"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.591172 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.591594 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.09158319 +0000 UTC m=+145.244729318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: W0130 21:18:13.617719 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573e1765_0165_4d38_840b_04b29206758f.slice/crio-753c5939f4dff936c0d332e704e53a6422763775214eef019ec770bb261cd307 WatchSource:0}: Error finding container 753c5939f4dff936c0d332e704e53a6422763775214eef019ec770bb261cd307: Status 404 returned error can't find the container with id 753c5939f4dff936c0d332e704e53a6422763775214eef019ec770bb261cd307 Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.627626 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5cbv7"] Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.663963 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-brd2r" podStartSLOduration=124.663946928 podStartE2EDuration="2m4.663946928s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:13.661583278 +0000 UTC m=+144.814729416" watchObservedRunningTime="2026-01-30 21:18:13.663946928 +0000 UTC m=+144.817093066" Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.693987 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.694320 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.194307121 +0000 UTC m=+145.347453259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.725455 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" event={"ID":"15134023-081c-4cdd-bf87-dc07d86f3dfd","Type":"ContainerStarted","Data":"19179d248226e58146f8db1cf3b4ca506cac1ea6d0268d84299321332a6a17a5"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.737717 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" event={"ID":"03fad3ad-d98f-420e-be43-8081c20dd6d4","Type":"ContainerStarted","Data":"0d0c66cfabe0b9e667614a0b80cf94888b41138c85655b79b5bef1b779c0ae00"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.762517 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" 
event={"ID":"011bab8f-7841-4e99-8d47-ee7ed71b9ec5","Type":"ContainerStarted","Data":"908af42caf90f170b13d139ae623dd616e2b6b9207e8091d68224a751012ef73"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.771317 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" event={"ID":"eb7c1a69-33bc-4cbe-933f-905775505373","Type":"ContainerStarted","Data":"9d6f4d815c6e3ce08d4cc8b581220842ed266548880cb4745fcc5b6058252fca"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.790939 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" event={"ID":"207681cc-7b97-4401-89ce-2fa68270a9be","Type":"ContainerStarted","Data":"87dd792b0b3858c2dc861eb69ed91338283c33cb321048afa5f0d4e0f03f2c05"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.795098 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.795650 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.295639451 +0000 UTC m=+145.448785579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.806513 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qb6gd" podStartSLOduration=124.8064959 podStartE2EDuration="2m4.8064959s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:13.804908673 +0000 UTC m=+144.958054811" watchObservedRunningTime="2026-01-30 21:18:13.8064959 +0000 UTC m=+144.959642038" Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.820240 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" event={"ID":"2f3dca3c-cddc-472e-8d57-ac479e9adcc6","Type":"ContainerStarted","Data":"88b92ba8a5ef8e7201abb7dd9c0a3d1403399bf51176ef3bfb6a3dbbd32640a4"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.823272 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" event={"ID":"7bb3210b-e06b-45f5-9df7-2a6b8e322223","Type":"ContainerStarted","Data":"c62dd3ae69d871c61fc60eddab9b26709cf8e24b836e61985cf95122b31b6d52"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.824905 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" 
event={"ID":"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79","Type":"ContainerStarted","Data":"d4c3d0cb7509d1ba745695b1a5d38a208b75c036833e605e4d7018f58e0fda56"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.826014 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" event={"ID":"1ae1a469-a830-4aef-bb3b-eb786efa1078","Type":"ContainerStarted","Data":"626736a6aa2b600523a8a59a7ec63f8e75553a71b0452c123aef3fae640d276d"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.827474 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" event={"ID":"12d3ad25-9e32-4467-8a93-43dfee213499","Type":"ContainerStarted","Data":"64e3366e358cb32ae5652ef5e71d95808af839b095e5ddd8a4dac4f143102467"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.828767 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" event={"ID":"dd752012-3171-434a-a470-ed59fe9a382d","Type":"ContainerStarted","Data":"37754c9d77d8cdbc08261822be281a4148fdddc734f6c749f35ce8f463c7a1de"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.828795 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" event={"ID":"dd752012-3171-434a-a470-ed59fe9a382d","Type":"ContainerStarted","Data":"4a7bf6517ccac36c4b88af90fa6daeffce6c6ac6a22dcfd62fe82912f72acad5"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.830176 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" event={"ID":"42c52fd3-a2bb-4345-b047-592d8525bf76","Type":"ContainerStarted","Data":"620d4aab817d5833a0fed913b6a21403337e601a39f8a32464b523df09bb2e52"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.845066 4834 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" event={"ID":"641ade1d-9c8e-4b46-b599-32b1165e7528","Type":"ContainerStarted","Data":"cf2f37a3ceaa45ada84c5bd90c3b0e431a73a68351d55244e92f48926e48fa49"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.860307 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" podStartSLOduration=123.860290152 podStartE2EDuration="2m3.860290152s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:13.858739866 +0000 UTC m=+145.011886004" watchObservedRunningTime="2026-01-30 21:18:13.860290152 +0000 UTC m=+145.013436290" Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.875434 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4k6d4" event={"ID":"59b3b974-9ba8-426b-8836-34ecbb56f86f","Type":"ContainerStarted","Data":"9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.876468 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" event={"ID":"aab9a42d-c833-46b2-a745-1bb95ada7f68","Type":"ContainerStarted","Data":"11160ce93d171339a3eaddddc1aecfd0c05723be0c4eb057f68a59b99220136d"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.881813 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" event={"ID":"573e1765-0165-4d38-840b-04b29206758f","Type":"ContainerStarted","Data":"753c5939f4dff936c0d332e704e53a6422763775214eef019ec770bb261cd307"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.903287 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.908225 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" event={"ID":"28198978-fb02-4208-9967-2c6ac9258439","Type":"ContainerStarted","Data":"87e773fd406471b0041c5533ccffd209966e90d95bc3e2ee0eab1ca6e97f32b0"} Jan 30 21:18:13 crc kubenswrapper[4834]: E0130 21:18:13.910719 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.410698514 +0000 UTC m=+145.563844652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.960412 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-vmqm2" event={"ID":"57726227-3cf1-4553-9b60-63e2082c887d","Type":"ContainerStarted","Data":"c6549ce3546e472c30acb06b191e101167d2e8f716bc6d73b521849ee49e6de0"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.966686 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9c6nm" 
event={"ID":"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a","Type":"ContainerStarted","Data":"58c800f191fc93434efb8869203ff9f9987968a2e98347aa64cdbb4f48e4c016"} Jan 30 21:18:13 crc kubenswrapper[4834]: I0130 21:18:13.969890 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" podStartSLOduration=124.969879714 podStartE2EDuration="2m4.969879714s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:13.913643281 +0000 UTC m=+145.066789419" watchObservedRunningTime="2026-01-30 21:18:13.969879714 +0000 UTC m=+145.123025852" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.003930 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5qhdb" podStartSLOduration=124.003915975 podStartE2EDuration="2m4.003915975s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.002962457 +0000 UTC m=+145.156108595" watchObservedRunningTime="2026-01-30 21:18:14.003915975 +0000 UTC m=+145.157062113" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.008428 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.014665 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.514645551 +0000 UTC m=+145.667791689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.021519 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cxssc" event={"ID":"c568ee8a-e57b-46b4-9852-277e61360e02","Type":"ContainerStarted","Data":"3160fe7cb24f855378f42c26ee769490e0bb008a101c0427b5de6f2e37487a78"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.021579 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cxssc" event={"ID":"c568ee8a-e57b-46b4-9852-277e61360e02","Type":"ContainerStarted","Data":"938b36de807e639a09a68ef29e47630d8fa00207e8e29b09435f4949bcc7da2b"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.022375 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4rjd7" event={"ID":"ee1bf6ec-599d-4a04-b05b-32db37d474b8","Type":"ContainerStarted","Data":"3ca09dd69cda9e90ccb854479203ca7c597277b4b297ea47d0d078cbe1dc9cf3"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.023290 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" 
event={"ID":"55022479-2799-4d9b-b7a2-b2aa98aad754","Type":"ContainerStarted","Data":"7ce9c263a34977d631f8ad14b6359b9bcab8d0e25969517bc72a55c2642566d7"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.023311 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" event={"ID":"55022479-2799-4d9b-b7a2-b2aa98aad754","Type":"ContainerStarted","Data":"79ce97cfa31ad5d983f62a2e254b4d7ab6f45e46392a356d363c544e10550971"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.029413 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" event={"ID":"bf7807b8-0393-4e17-ad63-e598aa25593e","Type":"ContainerStarted","Data":"3dc6a363cbe7478ef9248dd9fdc3c24a115f6e5ed52485be7ce8bbbbf25dbfc0"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.030238 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" podStartSLOduration=125.030222729 podStartE2EDuration="2m5.030222729s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.029701644 +0000 UTC m=+145.182847782" watchObservedRunningTime="2026-01-30 21:18:14.030222729 +0000 UTC m=+145.183368867" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.049351 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" event={"ID":"99b4691a-d300-46e1-9ca1-ea8287465be8","Type":"ContainerStarted","Data":"4d20077a857dc29a265ac9b71c6ff152f78f6594e81ffc4e2de04af78b3cca67"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.058028 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" 
event={"ID":"96b8d762-5d1f-4632-9685-681294725b38","Type":"ContainerStarted","Data":"d43ee1f900638eb2dba41dc14d45260f06570c436a03183e523072ff18a16b6b"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.073529 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.074446 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.074473 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.120794 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.121896 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.621881984 +0000 UTC m=+145.775028122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.127139 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n7h7d" podStartSLOduration=125.127121388 podStartE2EDuration="2m5.127121388s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.066262019 +0000 UTC m=+145.219408157" watchObservedRunningTime="2026-01-30 21:18:14.127121388 +0000 UTC m=+145.280267526" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.151980 4834 generic.go:334] "Generic (PLEG): container finished" podID="cbcfc442-8f5f-4da5-9289-98484d7c0cb3" containerID="cf577f45e48d9d0d2ac6ea063bf26dcae7c79b02707040292ffe419232beb001" exitCode=0 Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.152075 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" event={"ID":"cbcfc442-8f5f-4da5-9289-98484d7c0cb3","Type":"ContainerDied","Data":"cf577f45e48d9d0d2ac6ea063bf26dcae7c79b02707040292ffe419232beb001"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.161416 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-n92wx" podStartSLOduration=125.161386456 podStartE2EDuration="2m5.161386456s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.12851596 +0000 UTC m=+145.281662098" watchObservedRunningTime="2026-01-30 21:18:14.161386456 +0000 UTC m=+145.314532614" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.162757 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-vmqm2" podStartSLOduration=124.162749246 podStartE2EDuration="2m4.162749246s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.160614494 +0000 UTC m=+145.313760632" watchObservedRunningTime="2026-01-30 21:18:14.162749246 +0000 UTC m=+145.315895384" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.169568 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" event={"ID":"04d66c5d-67d0-49dd-8777-44d9bc45090b","Type":"ContainerStarted","Data":"9e8b4fda5405972bf96239b4e61d47a530c92b291a3647508e58890da9b6a38b"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.188163 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-30 21:13:13 +0000 UTC, rotation deadline is 2026-12-02 11:59:23.033884594 +0000 UTC Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.188247 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7334h41m8.845639518s for next certificate rotation Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.192752 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-25hvk" event={"ID":"aae73693-2f17-4d81-9e1e-f510035bd84f","Type":"ContainerStarted","Data":"ac4775198bfd7f0cf7654b4ceaa80a4e1c0421fed39a9844793207f26f84af24"} Jan 30 21:18:14 crc kubenswrapper[4834]: 
I0130 21:18:14.228873 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.229491 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.729474609 +0000 UTC m=+145.882620747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.261348 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gpzgc" podStartSLOduration=124.261327905 podStartE2EDuration="2m4.261327905s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.259191602 +0000 UTC m=+145.412337740" watchObservedRunningTime="2026-01-30 21:18:14.261327905 +0000 UTC m=+145.414474043" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.264107 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4vdhp" podStartSLOduration=124.264095127 podStartE2EDuration="2m4.264095127s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.219424903 +0000 UTC m=+145.372571041" watchObservedRunningTime="2026-01-30 21:18:14.264095127 +0000 UTC m=+145.417241265" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.292004 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cxssc" podStartSLOduration=6.291988707 podStartE2EDuration="6.291988707s" podCreationTimestamp="2026-01-30 21:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.290967667 +0000 UTC m=+145.444113805" watchObservedRunningTime="2026-01-30 21:18:14.291988707 +0000 UTC m=+145.445134835" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.310992 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" event={"ID":"f3246e84-3def-488f-8a8f-069bdc3fa563","Type":"ContainerStarted","Data":"02840464958cec4c67536767b1b7a8fa47afb3116df59f556c51d1b87308f474"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.318336 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" event={"ID":"151c367e-d6c2-4433-a401-2b6390b4ce09","Type":"ContainerStarted","Data":"8047f41cfe4c0ba5abea55d5de26ae62de7f3aa9c23b3ab3ae4ff1efbdf8b64f"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.327200 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" 
event={"ID":"6d66829f-7076-4ef3-8812-3b37496abd89","Type":"ContainerStarted","Data":"dde754dac52a36d1a31ad0bfc4130164e6f52ad60eabb42a0ad0bd3a9bdc083b"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.334584 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.334907 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.834889428 +0000 UTC m=+145.988035566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.367387 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" event={"ID":"6362617b-267a-4763-835f-b77935ceec53","Type":"ContainerStarted","Data":"146bafbcbe7b2c32f224754af41fbecd9d9aab3f7991c948abfba9731449ec6e"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.367445 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" 
event={"ID":"6362617b-267a-4763-835f-b77935ceec53","Type":"ContainerStarted","Data":"3a5124902828db22b8a50b1a9704d0b8bcafeaf502de55131cfd29aa0fb39d13"} Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.380301 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xqnhs" podStartSLOduration=124.380275443 podStartE2EDuration="2m4.380275443s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.379772298 +0000 UTC m=+145.532918436" watchObservedRunningTime="2026-01-30 21:18:14.380275443 +0000 UTC m=+145.533421581" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.380553 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.380776 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.381832 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4k6d4" podStartSLOduration=125.381825939 podStartE2EDuration="2m5.381825939s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.334461986 +0000 UTC m=+145.487608124" watchObservedRunningTime="2026-01-30 21:18:14.381825939 +0000 UTC m=+145.534972077" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.443101 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.443913 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:14.943901194 +0000 UTC m=+146.097047332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.459928 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" podStartSLOduration=125.459908945 podStartE2EDuration="2m5.459908945s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.419844117 +0000 UTC m=+145.572990255" watchObservedRunningTime="2026-01-30 21:18:14.459908945 +0000 UTC m=+145.613055083" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.490955 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hpqw6" podStartSLOduration=125.490935617 podStartE2EDuration="2m5.490935617s" podCreationTimestamp="2026-01-30 21:16:09 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:14.487889028 +0000 UTC m=+145.641035166" watchObservedRunningTime="2026-01-30 21:18:14.490935617 +0000 UTC m=+145.644081745" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.544857 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.547424 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.047385467 +0000 UTC m=+146.200531605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.653120 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.653764 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.153749295 +0000 UTC m=+146.306895433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.732166 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.754613 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.754951 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.254935821 +0000 UTC m=+146.408081959 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.855721 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.856024 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.356013913 +0000 UTC m=+146.509160051 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:14 crc kubenswrapper[4834]: I0130 21:18:14.961058 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:14 crc kubenswrapper[4834]: E0130 21:18:14.961444 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.461425443 +0000 UTC m=+146.614571591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.063590 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.064407 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.56435617 +0000 UTC m=+146.717502308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.078277 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:18:15 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:15 crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:15 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.078320 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.166484 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.166727 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-30 21:18:15.66670482 +0000 UTC m=+146.819850958 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.166853 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.167200 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.667176474 +0000 UTC m=+146.820322612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.268496 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.268829 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.768812753 +0000 UTC m=+146.921958891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.269081 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.269423 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.76941415 +0000 UTC m=+146.922560278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.372119 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.372553 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.872536053 +0000 UTC m=+147.025682191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.404451 4834 generic.go:334] "Generic (PLEG): container finished" podID="dd752012-3171-434a-a470-ed59fe9a382d" containerID="37754c9d77d8cdbc08261822be281a4148fdddc734f6c749f35ce8f463c7a1de" exitCode=0 Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.404596 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" event={"ID":"dd752012-3171-434a-a470-ed59fe9a382d","Type":"ContainerDied","Data":"37754c9d77d8cdbc08261822be281a4148fdddc734f6c749f35ce8f463c7a1de"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.404652 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" event={"ID":"dd752012-3171-434a-a470-ed59fe9a382d","Type":"ContainerStarted","Data":"2d7c6c912db82dac816759548ae7a511aedbb61cac069f29e158847acdb0bfce"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.405466 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.415865 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" event={"ID":"baf610cb-71c9-4589-8aa1-74bf3030485b","Type":"ContainerStarted","Data":"6ef6e59bb8f7cee5f0173cb65bb2118c90f50d570a0fa691658873d4c75364b9"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.415915 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" event={"ID":"baf610cb-71c9-4589-8aa1-74bf3030485b","Type":"ContainerStarted","Data":"165b9460482584a7c3036422e11ad770870472a9bc6e440a95bc5c7bb8548070"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.428706 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" podStartSLOduration=126.428690104 podStartE2EDuration="2m6.428690104s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.427686745 +0000 UTC m=+146.580832883" watchObservedRunningTime="2026-01-30 21:18:15.428690104 +0000 UTC m=+146.581836242" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.428759 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4rjd7" event={"ID":"ee1bf6ec-599d-4a04-b05b-32db37d474b8","Type":"ContainerStarted","Data":"cd82cf9affc8901e998219b4ffb1059ef30e8f42d42f53d4d52527e28c170c1e"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.429487 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.435752 4834 patch_prober.go:28] interesting pod/console-operator-58897d9998-4rjd7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.435811 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4rjd7" podUID="ee1bf6ec-599d-4a04-b05b-32db37d474b8" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.444573 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" event={"ID":"573e1765-0165-4d38-840b-04b29206758f","Type":"ContainerStarted","Data":"186d016c786da6b28afa391de2b290eae9a7e588d33363727bf9c74089ad5900"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.445040 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" event={"ID":"573e1765-0165-4d38-840b-04b29206758f","Type":"ContainerStarted","Data":"c1ccd24ca31c095a7bc98cfe6c822328754c97d5c348663750430b2e3a065b56"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.454868 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" event={"ID":"cbcfc442-8f5f-4da5-9289-98484d7c0cb3","Type":"ContainerStarted","Data":"25d75d6ecfef0060a6e06511f5cd8a4e9a1fbc216084bd7ed96c87eabcfaffa0"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.464246 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" event={"ID":"12d3ad25-9e32-4467-8a93-43dfee213499","Type":"ContainerStarted","Data":"dd6c651c5574b790767870c2d8995a951eea925fe51d219dbfbadaf78ecfd848"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.473144 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" podStartSLOduration=126.473127261 podStartE2EDuration="2m6.473127261s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.47240555 
+0000 UTC m=+146.625551688" watchObservedRunningTime="2026-01-30 21:18:15.473127261 +0000 UTC m=+146.626273399" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.473899 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.475040 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:15.975027067 +0000 UTC m=+147.128173195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.485873 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" event={"ID":"04d66c5d-67d0-49dd-8777-44d9bc45090b","Type":"ContainerStarted","Data":"fad7d96b58c2ef8f2c10ef8e1a6071f6ed6ebdee0f5e1112188261edf7e5ce2f"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.485918 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" 
event={"ID":"04d66c5d-67d0-49dd-8777-44d9bc45090b","Type":"ContainerStarted","Data":"47b3bf2e88ea050c04e18e541d29566a3af871dd3c629da73e0fb9b89ba78554"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.498691 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p2d2l" podStartSLOduration=125.498670312 podStartE2EDuration="2m5.498670312s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.486543285 +0000 UTC m=+146.639689413" watchObservedRunningTime="2026-01-30 21:18:15.498670312 +0000 UTC m=+146.651816450" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.502460 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-25hvk" event={"ID":"aae73693-2f17-4d81-9e1e-f510035bd84f","Type":"ContainerStarted","Data":"71f105cc89803ebe0b225f08644a0f2d3a321d849e5421a34eec2f4833ced804"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.503461 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-25hvk" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.505963 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" event={"ID":"6d66829f-7076-4ef3-8812-3b37496abd89","Type":"ContainerStarted","Data":"cef269660e8720cdd179034c612612ae7ba9fa993438df060d34a92bd830f807"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.512765 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-25hvk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.512821 4834 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-25hvk" podUID="aae73693-2f17-4d81-9e1e-f510035bd84f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.514047 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" event={"ID":"207681cc-7b97-4401-89ce-2fa68270a9be","Type":"ContainerStarted","Data":"fc5e3d348800b2a5c3992ed071226f5c73ff959169ac71cc397af9bc3de7bf4e"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.544542 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-4rjd7" podStartSLOduration=126.544523591 podStartE2EDuration="2m6.544523591s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.512931021 +0000 UTC m=+146.666077159" watchObservedRunningTime="2026-01-30 21:18:15.544523591 +0000 UTC m=+146.697669729" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.550786 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5cbv7" event={"ID":"0e640e7b-4b48-456a-a65a-31c1e2047222","Type":"ContainerStarted","Data":"9ba0383811e5349dc9d401a4da2228c66d846b68353cf7e826a99f746a221d7b"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.550823 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5cbv7" event={"ID":"0e640e7b-4b48-456a-a65a-31c1e2047222","Type":"ContainerStarted","Data":"935e56f28307b0b2faa4fe53f49777871b3a3f08cd1cea7c4f71b13053ed4bba"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.550834 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" event={"ID":"1595dff8-9195-4ab2-83ed-5ccd825b21eb","Type":"ContainerStarted","Data":"54defce8885a665098572b8a6af7584c70a2aefe81846aebd28c52763e6a8ffe"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.551050 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" event={"ID":"7bb3210b-e06b-45f5-9df7-2a6b8e322223","Type":"ContainerStarted","Data":"b64e345ab06ef9f10bfb9e457461bc228ee99cd5a5b15ddbd2d137a7f3416b6a"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.551059 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" event={"ID":"7bb3210b-e06b-45f5-9df7-2a6b8e322223","Type":"ContainerStarted","Data":"93535b52ece48d9b5dd68686abdce6e41543b7f766672ee96ca1c061f8714dc8"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.552824 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" event={"ID":"151c367e-d6c2-4433-a401-2b6390b4ce09","Type":"ContainerStarted","Data":"ae9073719969e17309a492a3cd272e00e5c018d1f76309d34c1c5f5dc0166c0a"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.564232 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" event={"ID":"bf7807b8-0393-4e17-ad63-e598aa25593e","Type":"ContainerStarted","Data":"6bff81e35f94eb75af75c046bb40b6c0492d99da77ddd0849182da64d35fb5b9"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.564281 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" event={"ID":"bf7807b8-0393-4e17-ad63-e598aa25593e","Type":"ContainerStarted","Data":"12e45cc74fc8a7e3af6ee64ef954fb934ce194f5d5133c067edbe72947a02b70"} Jan 30 21:18:15 crc kubenswrapper[4834]: 
I0130 21:18:15.564820 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.570530 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9c6nm" event={"ID":"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a","Type":"ContainerStarted","Data":"6fffdae10b02d68fe57e46b3bf8db0c3d097ca3893267227fe76f58e97491af8"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.571184 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.575880 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" event={"ID":"28198978-fb02-4208-9967-2c6ac9258439","Type":"ContainerStarted","Data":"e7a1dd1b566fc3f2687dc49b18a89e5cf22fefd2771919a9049ccd7d303c38dd"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.576136 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.577031 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.578201 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.07817797 +0000 UTC m=+147.231324108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.578790 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" podStartSLOduration=125.578778578 podStartE2EDuration="2m5.578778578s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.545251482 +0000 UTC m=+146.698397620" watchObservedRunningTime="2026-01-30 21:18:15.578778578 +0000 UTC m=+146.731924716" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.579861 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9wj8w" podStartSLOduration=125.579853959 podStartE2EDuration="2m5.579853959s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.57886775 +0000 UTC m=+146.732013888" watchObservedRunningTime="2026-01-30 21:18:15.579853959 +0000 UTC m=+146.733000097" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.601325 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" event={"ID":"aab9a42d-c833-46b2-a745-1bb95ada7f68","Type":"ContainerStarted","Data":"f1e3f597d5bb3d4210ce321e536af817f9bb98d856b761ab9d758e1a17d7fc75"} Jan 30 21:18:15 crc 
kubenswrapper[4834]: I0130 21:18:15.602188 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.626201 4834 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rtdcj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body= Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.626266 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" podUID="28198978-fb02-4208-9967-2c6ac9258439" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.634774 4834 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9kdrb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.634835 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.636220 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" 
event={"ID":"96b8d762-5d1f-4632-9685-681294725b38","Type":"ContainerStarted","Data":"1f49610a99481ccf549138d4db2a395e51d86d4c98be45da297599c5e3c0c51e"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.654009 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" event={"ID":"eb7c1a69-33bc-4cbe-933f-905775505373","Type":"ContainerStarted","Data":"d541837b75381dd2677683672ba9c1e9e4fb1240f5ad513ec45ad990e1d04f2d"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.655795 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5cbv7" podStartSLOduration=7.655784151 podStartE2EDuration="7.655784151s" podCreationTimestamp="2026-01-30 21:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.644137629 +0000 UTC m=+146.797283767" watchObservedRunningTime="2026-01-30 21:18:15.655784151 +0000 UTC m=+146.808930289" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.656073 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bjbm4" podStartSLOduration=125.65606611 podStartE2EDuration="2m5.65606611s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.615150006 +0000 UTC m=+146.768296134" watchObservedRunningTime="2026-01-30 21:18:15.65606611 +0000 UTC m=+146.809212248" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.665106 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" 
event={"ID":"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79","Type":"ContainerStarted","Data":"a37228c5ab932971444d65a63621ae5c91976e9c5d3490931f53f9351415d45c"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.665165 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" event={"ID":"a57e91ba-e053-4f9d-bf7f-3a5e0f400d79","Type":"ContainerStarted","Data":"165ab442e2674690512e9d58e4ce2080773839e5ab873c677cc43c9491a52bb9"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.671346 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6n9gz" podStartSLOduration=125.671328178 podStartE2EDuration="2m5.671328178s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.670633648 +0000 UTC m=+146.823779786" watchObservedRunningTime="2026-01-30 21:18:15.671328178 +0000 UTC m=+146.824474316" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.678907 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.681512 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.181499168 +0000 UTC m=+147.334645306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.684734 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" event={"ID":"15134023-081c-4cdd-bf87-dc07d86f3dfd","Type":"ContainerStarted","Data":"bb25c554b42d1f109dea9b0735c67dc7fcfae14b82174b81c3860ffd502aa5a1"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.685360 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.712251 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" event={"ID":"42c52fd3-a2bb-4345-b047-592d8525bf76","Type":"ContainerStarted","Data":"09c85ba8e362e7613388a70d3b3052d047b69a1d353713a044555a37b2455f94"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.712302 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" event={"ID":"42c52fd3-a2bb-4345-b047-592d8525bf76","Type":"ContainerStarted","Data":"c9942542cc17682633ddef8e3c91c51eba7b62ce1cedc6f91ee4f593b7caf8de"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.714499 4834 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-lmpvg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 
10.217.0.25:8443: connect: connection refused" start-of-body= Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.714539 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" podUID="15134023-081c-4cdd-bf87-dc07d86f3dfd" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.717744 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x44bc" podStartSLOduration=125.717729363 podStartE2EDuration="2m5.717729363s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.715303592 +0000 UTC m=+146.868449730" watchObservedRunningTime="2026-01-30 21:18:15.717729363 +0000 UTC m=+146.870875501" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.723681 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" event={"ID":"99b4691a-d300-46e1-9ca1-ea8287465be8","Type":"ContainerStarted","Data":"8f6ad91e9e8f58623522acd14499c3df1efaa77bc157d6cc69a44da6e9e4adab"} Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.723731 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.761359 4834 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cmds5 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Jan 30 21:18:15 crc 
kubenswrapper[4834]: I0130 21:18:15.761444 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" podUID="99b4691a-d300-46e1-9ca1-ea8287465be8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.780574 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.782078 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.282061845 +0000 UTC m=+147.435207983 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.797031 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" podStartSLOduration=125.797009504 podStartE2EDuration="2m5.797009504s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.782435956 +0000 UTC m=+146.935582084" watchObservedRunningTime="2026-01-30 21:18:15.797009504 +0000 UTC m=+146.950155642" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.798325 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-25hvk" podStartSLOduration=126.798318493 podStartE2EDuration="2m6.798318493s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.761772238 +0000 UTC m=+146.914918376" watchObservedRunningTime="2026-01-30 21:18:15.798318493 +0000 UTC m=+146.951464631" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.823905 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" podStartSLOduration=125.823889185 podStartE2EDuration="2m5.823889185s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.823091061 +0000 UTC m=+146.976237199" watchObservedRunningTime="2026-01-30 21:18:15.823889185 +0000 UTC m=+146.977035323" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.849638 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-88rvm" podStartSLOduration=126.849620602 podStartE2EDuration="2m6.849620602s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.848673644 +0000 UTC m=+147.001819782" watchObservedRunningTime="2026-01-30 21:18:15.849620602 +0000 UTC m=+147.002766740" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.875948 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" podStartSLOduration=125.875932615 podStartE2EDuration="2m5.875932615s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.875720029 +0000 UTC m=+147.028866167" watchObservedRunningTime="2026-01-30 21:18:15.875932615 +0000 UTC m=+147.029078753" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.882707 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.883005 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.382993343 +0000 UTC m=+147.536139481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.971942 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kx4ct" podStartSLOduration=126.971915518 podStartE2EDuration="2m6.971915518s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.90534964 +0000 UTC m=+147.058495778" watchObservedRunningTime="2026-01-30 21:18:15.971915518 +0000 UTC m=+147.125061646" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.973807 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" podStartSLOduration=125.973799253 podStartE2EDuration="2m5.973799253s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.968980562 +0000 UTC m=+147.122126700" watchObservedRunningTime="2026-01-30 21:18:15.973799253 +0000 UTC m=+147.126945391" Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.983606 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.983756 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.483724305 +0000 UTC m=+147.636870443 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.983926 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:15 crc kubenswrapper[4834]: E0130 21:18:15.984434 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.484419746 +0000 UTC m=+147.637565884 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:15 crc kubenswrapper[4834]: I0130 21:18:15.998058 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" podStartSLOduration=125.998032126 podStartE2EDuration="2m5.998032126s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:15.996946144 +0000 UTC m=+147.150092282" watchObservedRunningTime="2026-01-30 21:18:15.998032126 +0000 UTC m=+147.151178264" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.022980 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.023047 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.024614 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pq6dw" podStartSLOduration=126.024591867 podStartE2EDuration="2m6.024591867s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:16.022459234 +0000 UTC m=+147.175605372" watchObservedRunningTime="2026-01-30 
21:18:16.024591867 +0000 UTC m=+147.177738005" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.027542 4834 patch_prober.go:28] interesting pod/apiserver-76f77b778f-wp8vc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.027594 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" podUID="baf610cb-71c9-4589-8aa1-74bf3030485b" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.086608 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:18:16 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:16 crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:16 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.086666 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.088733 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 
21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.088862 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.588843986 +0000 UTC m=+147.741990124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.089286 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7x57c" podStartSLOduration=126.089275419 podStartE2EDuration="2m6.089275419s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:16.089069943 +0000 UTC m=+147.242216081" watchObservedRunningTime="2026-01-30 21:18:16.089275419 +0000 UTC m=+147.242421557" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.089377 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.089387 4834 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-dns/dns-default-9c6nm" podStartSLOduration=8.089382792 podStartE2EDuration="8.089382792s" podCreationTimestamp="2026-01-30 21:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:16.059301288 +0000 UTC m=+147.212447426" watchObservedRunningTime="2026-01-30 21:18:16.089382792 +0000 UTC m=+147.242528930" Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.089629 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.589618649 +0000 UTC m=+147.742764787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.181093 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.181145 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.183690 4834 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-fqv56 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" 
start-of-body= Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.183782 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" podUID="cbcfc442-8f5f-4da5-9289-98484d7c0cb3" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.190662 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.190863 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.690840636 +0000 UTC m=+147.843986774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.190922 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.191559 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.691552097 +0000 UTC m=+147.844698235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.209537 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vpnmb" podStartSLOduration=127.209512275 podStartE2EDuration="2m7.209512275s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:16.156266829 +0000 UTC m=+147.309412967" watchObservedRunningTime="2026-01-30 21:18:16.209512275 +0000 UTC m=+147.362658413" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.209643 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gr5pq" podStartSLOduration=126.209638239 podStartE2EDuration="2m6.209638239s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:16.205539218 +0000 UTC m=+147.358685356" watchObservedRunningTime="2026-01-30 21:18:16.209638239 +0000 UTC m=+147.362784377" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.293993 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.294351 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.794336459 +0000 UTC m=+147.947482597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.397034 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.397418 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.89738555 +0000 UTC m=+148.050531688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.498621 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.498816 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.998791112 +0000 UTC m=+148.151937250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.499037 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.499351 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:16.999343248 +0000 UTC m=+148.152489386 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.600647 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.600825 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.100793192 +0000 UTC m=+148.253939330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.600927 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.601237 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.101229264 +0000 UTC m=+148.254375402 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.701873 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.702067 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.202040919 +0000 UTC m=+148.355187057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.702117 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.702420 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.2024087 +0000 UTC m=+148.355554838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.730061 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" event={"ID":"12d3ad25-9e32-4467-8a93-43dfee213499","Type":"ContainerStarted","Data":"f7a2c5d585a8731c2b5e678d658e87d8ab25974007eb53e34553cfcc8c6344e8"} Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.731819 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9c6nm" event={"ID":"00f4caf0-ad7c-4a5b-8a86-60bfb39c6f2a","Type":"ContainerStarted","Data":"fdea63b0324e8c16cccbf9c3f6caf735d7e51ac38f1dba663cae3a7a5c282461"} Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.733492 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" event={"ID":"03fad3ad-d98f-420e-be43-8081c20dd6d4","Type":"ContainerStarted","Data":"04639297cc075d5bba4efa00ec0fa3678cbaf91bb0ce0916728bf16ce842a470"} Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.734180 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-25hvk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.734248 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-25hvk" podUID="aae73693-2f17-4d81-9e1e-f510035bd84f" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.734417 4834 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9kdrb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.734450 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.776138 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-lmpvg" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.801999 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cmds5" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.802879 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.803040 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-01-30 21:18:17.303021099 +0000 UTC m=+148.456167237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.803234 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.803580 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.303570795 +0000 UTC m=+148.456716933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.807977 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4ngsh" podStartSLOduration=126.807961824 podStartE2EDuration="2m6.807961824s" podCreationTimestamp="2026-01-30 21:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:16.761276291 +0000 UTC m=+147.914422429" watchObservedRunningTime="2026-01-30 21:18:16.807961824 +0000 UTC m=+147.961107962" Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.908448 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.908643 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.408616974 +0000 UTC m=+148.561763112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:16 crc kubenswrapper[4834]: I0130 21:18:16.908713 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:16 crc kubenswrapper[4834]: E0130 21:18:16.909037 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.409028976 +0000 UTC m=+148.562175114 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.009916 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.010185 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.51015693 +0000 UTC m=+148.663303068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.010561 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.010857 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.51084532 +0000 UTC m=+148.663991458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.039770 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4rjd7" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.077938 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:18:17 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:17 crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:17 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.077984 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.111477 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.111821 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.611797329 +0000 UTC m=+148.764943467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.213660 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.214047 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.714031035 +0000 UTC m=+148.867177173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.315054 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.315233 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.815208251 +0000 UTC m=+148.968354389 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.315285 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.315591 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.815579092 +0000 UTC m=+148.968725230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.416735 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.416926 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.916894851 +0000 UTC m=+149.070040989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.416956 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.417279 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:17.917272042 +0000 UTC m=+149.070418180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.518564 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.518771 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:18.018743226 +0000 UTC m=+149.171889364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.518829 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.519135 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:18.019122047 +0000 UTC m=+149.172268185 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.620589 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.620781 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:18.120755996 +0000 UTC m=+149.273902134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.621045 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.621085 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.621107 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.621180 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.621203 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.621444 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:18.121427986 +0000 UTC m=+149.274574124 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.621928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.626802 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.632949 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.644231 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.658833 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.676418 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.723100 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.723728 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-30 21:18:18.223709503 +0000 UTC m=+149.376855641 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.733805 4834 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.738471 4834 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rtdcj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.738536 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" podUID="28198978-fb02-4208-9967-2c6ac9258439" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.756048 4834 generic.go:334] "Generic (PLEG): container finished" podID="f3246e84-3def-488f-8a8f-069bdc3fa563" containerID="02840464958cec4c67536767b1b7a8fa47afb3116df59f556c51d1b87308f474" exitCode=0 Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.756107 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" 
event={"ID":"f3246e84-3def-488f-8a8f-069bdc3fa563","Type":"ContainerDied","Data":"02840464958cec4c67536767b1b7a8fa47afb3116df59f556c51d1b87308f474"} Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.770007 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-82j9j"] Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.770943 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.772649 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.776721 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.792983 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82j9j"] Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.797460 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" event={"ID":"03fad3ad-d98f-420e-be43-8081c20dd6d4","Type":"ContainerStarted","Data":"14d1a0c77bc451079781152715d5b0bc20c5fed28a0bc83c2521c8e42b1df948"} Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.797775 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" event={"ID":"03fad3ad-d98f-420e-be43-8081c20dd6d4","Type":"ContainerStarted","Data":"ce4db76a3e74a4b6565dc708e84c9e7216814084faf4f339fa4dadd23a06e664"} Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.799937 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-25hvk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial 
tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.800048 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-25hvk" podUID="aae73693-2f17-4d81-9e1e-f510035bd84f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.824703 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/d19012b5-546c-4af0-a419-f57194f5eff8-kube-api-access-d9mqv\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.824941 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-utilities\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.825031 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-catalog-content\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.825157 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:17 crc kubenswrapper[4834]: E0130 21:18:17.825509 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-30 21:18:18.325496567 +0000 UTC m=+149.478642695 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pfvpm" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.865765 4834 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-30T21:18:17.733832081Z","Handler":null,"Name":""} Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.893649 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rtdcj" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.914003 4834 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.914038 4834 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 30 21:18:17 crc 
kubenswrapper[4834]: I0130 21:18:17.929020 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.929320 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/d19012b5-546c-4af0-a419-f57194f5eff8-kube-api-access-d9mqv\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.929363 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-utilities\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.929423 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-catalog-content\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.931220 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-utilities\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 
21:18:17.932676 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-catalog-content\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.954499 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:18:17 crc kubenswrapper[4834]: I0130 21:18:17.979512 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/d19012b5-546c-4af0-a419-f57194f5eff8-kube-api-access-d9mqv\") pod \"certified-operators-82j9j\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") " pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.030327 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.074574 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Jan 30 21:18:18 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:18 crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:18 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.074624 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.084740 4834 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.084785 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.098681 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.148875 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4sz2l"] Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.162280 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.169196 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4sz2l"] Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.207703 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pfvpm\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.236266 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-catalog-content\") pod \"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.236349 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlbdl\" (UniqueName: \"kubernetes.io/projected/768fba83-c2e4-401a-81c8-ad4ecec9dac7-kube-api-access-dlbdl\") pod \"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.236381 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-utilities\") pod \"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 
21:18:18.262445 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.337803 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlbdl\" (UniqueName: \"kubernetes.io/projected/768fba83-c2e4-401a-81c8-ad4ecec9dac7-kube-api-access-dlbdl\") pod \"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.337859 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-utilities\") pod \"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.337899 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-catalog-content\") pod \"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.338590 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-catalog-content\") pod \"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.339120 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-utilities\") pod 
\"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.349781 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-blc9b"] Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.355509 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.360964 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.369019 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlbdl\" (UniqueName: \"kubernetes.io/projected/768fba83-c2e4-401a-81c8-ad4ecec9dac7-kube-api-access-dlbdl\") pod \"certified-operators-4sz2l\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.385451 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-blc9b"] Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.438884 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-catalog-content\") pod \"community-operators-blc9b\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.438941 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbj9z\" (UniqueName: \"kubernetes.io/projected/be0cb498-ae6b-47f1-8068-9f7653206006-kube-api-access-sbj9z\") pod 
\"community-operators-blc9b\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.439033 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-utilities\") pod \"community-operators-blc9b\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.475953 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-82j9j"] Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.511555 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.539716 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-utilities\") pod \"community-operators-blc9b\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.539772 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-catalog-content\") pod \"community-operators-blc9b\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.539799 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbj9z\" (UniqueName: \"kubernetes.io/projected/be0cb498-ae6b-47f1-8068-9f7653206006-kube-api-access-sbj9z\") pod \"community-operators-blc9b\" (UID: 
\"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.540678 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-catalog-content\") pod \"community-operators-blc9b\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.546832 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-utilities\") pod \"community-operators-blc9b\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.546985 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4lpsm"] Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.547937 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.560663 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbj9z\" (UniqueName: \"kubernetes.io/projected/be0cb498-ae6b-47f1-8068-9f7653206006-kube-api-access-sbj9z\") pod \"community-operators-blc9b\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") " pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.605331 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4lpsm"] Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.630438 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pfvpm"] Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.642258 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-utilities\") pod \"community-operators-4lpsm\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.642338 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7gxb\" (UniqueName: \"kubernetes.io/projected/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-kube-api-access-w7gxb\") pod \"community-operators-4lpsm\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.642376 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-catalog-content\") pod \"community-operators-4lpsm\" (UID: 
\"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: W0130 21:18:18.645501 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58995455_5f53_49bb_84e7_dab094ffec5b.slice/crio-5375b2fb94667eb26ba79cd698522217b7d93328e7f1941aebb3cb964a060863 WatchSource:0}: Error finding container 5375b2fb94667eb26ba79cd698522217b7d93328e7f1941aebb3cb964a060863: Status 404 returned error can't find the container with id 5375b2fb94667eb26ba79cd698522217b7d93328e7f1941aebb3cb964a060863 Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.715722 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.743408 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7gxb\" (UniqueName: \"kubernetes.io/projected/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-kube-api-access-w7gxb\") pod \"community-operators-4lpsm\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.743463 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-catalog-content\") pod \"community-operators-4lpsm\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.743502 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-utilities\") pod \"community-operators-4lpsm\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " 
pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.743903 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-utilities\") pod \"community-operators-4lpsm\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.744006 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-catalog-content\") pod \"community-operators-4lpsm\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.761487 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7gxb\" (UniqueName: \"kubernetes.io/projected/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-kube-api-access-w7gxb\") pod \"community-operators-4lpsm\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.805747 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" event={"ID":"58995455-5f53-49bb-84e7-dab094ffec5b","Type":"ContainerStarted","Data":"5375b2fb94667eb26ba79cd698522217b7d93328e7f1941aebb3cb964a060863"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.808076 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" event={"ID":"03fad3ad-d98f-420e-be43-8081c20dd6d4","Type":"ContainerStarted","Data":"9478aa8b68551dbd02288b205d1486a3e9cac43956c24bc943639615eb510939"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.811162 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5e7c01d966d99c8ec7e567a08b89691e87499b5d8f23d21ee986b447879e4437"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.811188 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"00401fc7f77d59c3138b22ea864584a88182c74730e68cb4cbdde1e5dd1e1a4c"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.813207 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8849f56787babf5f7e367362d34cf25123a11ad4c73f41083c582f1d29a92aa9"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.813230 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bff078897a51f65088e4596fe203bd2af2a46f40afb52a3c1b6f0ba23d5d34c5"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.814887 4834 generic.go:334] "Generic (PLEG): container finished" podID="d19012b5-546c-4af0-a419-f57194f5eff8" containerID="13862b67399434f77a503e0853c1f982148cace42f650e586eb46ea02172c7ae" exitCode=0 Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.814932 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82j9j" event={"ID":"d19012b5-546c-4af0-a419-f57194f5eff8","Type":"ContainerDied","Data":"13862b67399434f77a503e0853c1f982148cace42f650e586eb46ea02172c7ae"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.814947 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-82j9j" event={"ID":"d19012b5-546c-4af0-a419-f57194f5eff8","Type":"ContainerStarted","Data":"d68c8c55ed074bef04bd26ea319dd361389622fe8a4db7d5fd8768e0132e6ad0"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.817122 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.817569 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6fa7298f9f4a83affa9c53ba79f310dbe86d787a718dcb8caa5e7554176b4599"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.817618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4972578e2f9cf4b29def729fa80af51c8fb535422cddf3d870319d9dadaa7511"} Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.818056 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.822507 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gzhfr" podStartSLOduration=10.822488826 podStartE2EDuration="10.822488826s" podCreationTimestamp="2026-01-30 21:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:18.821216098 +0000 UTC m=+149.974362236" watchObservedRunningTime="2026-01-30 21:18:18.822488826 +0000 UTC m=+149.975634954" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.888712 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:18 crc kubenswrapper[4834]: I0130 21:18:18.921487 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-blc9b"] Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.021175 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4sz2l"] Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.082810 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:18:19 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:19 crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:19 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.083176 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.129493 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.198923 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4lpsm"] Jan 30 21:18:19 crc kubenswrapper[4834]: W0130 21:18:19.222994 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd80c77f_7d28_4be9_ac43_d8b7e09872e3.slice/crio-03aaae38ae215aa94881f4daeef3cec3868f8d56f6738ee16890d0455b1e6556 WatchSource:0}: Error finding container 03aaae38ae215aa94881f4daeef3cec3868f8d56f6738ee16890d0455b1e6556: Status 404 returned error can't find the container with id 03aaae38ae215aa94881f4daeef3cec3868f8d56f6738ee16890d0455b1e6556 Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.259328 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3246e84-3def-488f-8a8f-069bdc3fa563-config-volume\") pod \"f3246e84-3def-488f-8a8f-069bdc3fa563\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.259633 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrwl\" (UniqueName: \"kubernetes.io/projected/f3246e84-3def-488f-8a8f-069bdc3fa563-kube-api-access-2lrwl\") pod \"f3246e84-3def-488f-8a8f-069bdc3fa563\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.259733 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3246e84-3def-488f-8a8f-069bdc3fa563-secret-volume\") pod \"f3246e84-3def-488f-8a8f-069bdc3fa563\" (UID: \"f3246e84-3def-488f-8a8f-069bdc3fa563\") " Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.260101 4834 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/f3246e84-3def-488f-8a8f-069bdc3fa563-config-volume" (OuterVolumeSpecName: "config-volume") pod "f3246e84-3def-488f-8a8f-069bdc3fa563" (UID: "f3246e84-3def-488f-8a8f-069bdc3fa563"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.265972 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3246e84-3def-488f-8a8f-069bdc3fa563-kube-api-access-2lrwl" (OuterVolumeSpecName: "kube-api-access-2lrwl") pod "f3246e84-3def-488f-8a8f-069bdc3fa563" (UID: "f3246e84-3def-488f-8a8f-069bdc3fa563"). InnerVolumeSpecName "kube-api-access-2lrwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.269010 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3246e84-3def-488f-8a8f-069bdc3fa563-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f3246e84-3def-488f-8a8f-069bdc3fa563" (UID: "f3246e84-3def-488f-8a8f-069bdc3fa563"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.361686 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lrwl\" (UniqueName: \"kubernetes.io/projected/f3246e84-3def-488f-8a8f-069bdc3fa563-kube-api-access-2lrwl\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.361806 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3246e84-3def-488f-8a8f-069bdc3fa563-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.361877 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3246e84-3def-488f-8a8f-069bdc3fa563-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.447464 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 21:18:19 crc kubenswrapper[4834]: E0130 21:18:19.447738 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3246e84-3def-488f-8a8f-069bdc3fa563" containerName="collect-profiles" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.447754 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3246e84-3def-488f-8a8f-069bdc3fa563" containerName="collect-profiles" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.447855 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3246e84-3def-488f-8a8f-069bdc3fa563" containerName="collect-profiles" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.448276 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.455860 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.455947 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.474552 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.548288 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.566125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc4447a-6898-4437-b9a9-fa907be9e74b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9cc4447a-6898-4437-b9a9-fa907be9e74b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.566313 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc4447a-6898-4437-b9a9-fa907be9e74b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9cc4447a-6898-4437-b9a9-fa907be9e74b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.668433 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc4447a-6898-4437-b9a9-fa907be9e74b-kubelet-dir\") pod 
\"revision-pruner-9-crc\" (UID: \"9cc4447a-6898-4437-b9a9-fa907be9e74b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.668544 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc4447a-6898-4437-b9a9-fa907be9e74b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9cc4447a-6898-4437-b9a9-fa907be9e74b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.668624 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc4447a-6898-4437-b9a9-fa907be9e74b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9cc4447a-6898-4437-b9a9-fa907be9e74b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.693958 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc4447a-6898-4437-b9a9-fa907be9e74b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9cc4447a-6898-4437-b9a9-fa907be9e74b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.826133 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.834714 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" event={"ID":"58995455-5f53-49bb-84e7-dab094ffec5b","Type":"ContainerStarted","Data":"8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b"} Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.834913 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.837379 4834 generic.go:334] "Generic (PLEG): container finished" podID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerID="acd37b106904830e3568d9a76dbdcb05796535b9e44f91dc6e62cf6018d23056" exitCode=0 Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.837441 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lpsm" event={"ID":"cd80c77f-7d28-4be9-ac43-d8b7e09872e3","Type":"ContainerDied","Data":"acd37b106904830e3568d9a76dbdcb05796535b9e44f91dc6e62cf6018d23056"} Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.837501 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lpsm" event={"ID":"cd80c77f-7d28-4be9-ac43-d8b7e09872e3","Type":"ContainerStarted","Data":"03aaae38ae215aa94881f4daeef3cec3868f8d56f6738ee16890d0455b1e6556"} Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.838754 4834 generic.go:334] "Generic (PLEG): container finished" podID="be0cb498-ae6b-47f1-8068-9f7653206006" containerID="097ebf03473446b13c8c0c16a95615348ccf36ae363dc047dbbaf66f8520bb9d" exitCode=0 Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.838859 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blc9b" 
event={"ID":"be0cb498-ae6b-47f1-8068-9f7653206006","Type":"ContainerDied","Data":"097ebf03473446b13c8c0c16a95615348ccf36ae363dc047dbbaf66f8520bb9d"} Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.838920 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blc9b" event={"ID":"be0cb498-ae6b-47f1-8068-9f7653206006","Type":"ContainerStarted","Data":"aaecc91b58d80c614022df9a75dc08eae3e7e7e69d7b9aaec5ee621c8ce04644"} Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.841044 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" event={"ID":"f3246e84-3def-488f-8a8f-069bdc3fa563","Type":"ContainerDied","Data":"028c706b318ce0de3464448e638c0baed6b4615d2a86188be51942ad871faaf3"} Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.841086 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="028c706b318ce0de3464448e638c0baed6b4615d2a86188be51942ad871faaf3" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.841193 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2" Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.843179 4834 generic.go:334] "Generic (PLEG): container finished" podID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerID="7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a" exitCode=0 Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.843941 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sz2l" event={"ID":"768fba83-c2e4-401a-81c8-ad4ecec9dac7","Type":"ContainerDied","Data":"7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a"} Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.843986 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sz2l" event={"ID":"768fba83-c2e4-401a-81c8-ad4ecec9dac7","Type":"ContainerStarted","Data":"d2e5e5fbbb2a844194506a70ff916d9b05905ce9a106e138f4ec380d16817e38"} Jan 30 21:18:19 crc kubenswrapper[4834]: I0130 21:18:19.862143 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" podStartSLOduration=130.862123898 podStartE2EDuration="2m10.862123898s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:19.859769908 +0000 UTC m=+151.012916046" watchObservedRunningTime="2026-01-30 21:18:19.862123898 +0000 UTC m=+151.015270036" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.079061 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:18:20 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:20 
crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:20 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.079360 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.167122 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9q6bh"] Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.168359 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.171736 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.178181 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q6bh"] Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.287553 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-catalog-content\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.287608 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq88h\" (UniqueName: \"kubernetes.io/projected/433db183-f17f-4f55-b6f4-901614906a48-kube-api-access-xq88h\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 
21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.287634 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-utilities\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.388695 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-catalog-content\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.388766 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq88h\" (UniqueName: \"kubernetes.io/projected/433db183-f17f-4f55-b6f4-901614906a48-kube-api-access-xq88h\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.388799 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-utilities\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.389345 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-catalog-content\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc 
kubenswrapper[4834]: I0130 21:18:20.389375 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-utilities\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.431795 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq88h\" (UniqueName: \"kubernetes.io/projected/433db183-f17f-4f55-b6f4-901614906a48-kube-api-access-xq88h\") pod \"redhat-marketplace-9q6bh\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") " pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.445109 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 30 21:18:20 crc kubenswrapper[4834]: W0130 21:18:20.454559 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9cc4447a_6898_4437_b9a9_fa907be9e74b.slice/crio-323fa3653b82d0fe2ca9df55f639a31b18e60b75ccc2d59f525f3b53862b36ae WatchSource:0}: Error finding container 323fa3653b82d0fe2ca9df55f639a31b18e60b75ccc2d59f525f3b53862b36ae: Status 404 returned error can't find the container with id 323fa3653b82d0fe2ca9df55f639a31b18e60b75ccc2d59f525f3b53862b36ae Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.489066 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.547088 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwzrw"] Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.551722 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.555141 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwzrw"] Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.694105 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-utilities\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.694202 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwbq\" (UniqueName: \"kubernetes.io/projected/ae936d72-5b36-46cb-a845-714057922b0e-kube-api-access-bqwbq\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.694233 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-catalog-content\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.795891 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwbq\" (UniqueName: \"kubernetes.io/projected/ae936d72-5b36-46cb-a845-714057922b0e-kube-api-access-bqwbq\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.796192 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-catalog-content\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.796223 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-utilities\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.796724 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-utilities\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.797215 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-catalog-content\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.814242 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwbq\" (UniqueName: \"kubernetes.io/projected/ae936d72-5b36-46cb-a845-714057922b0e-kube-api-access-bqwbq\") pod \"redhat-marketplace-kwzrw\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.851995 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9cc4447a-6898-4437-b9a9-fa907be9e74b","Type":"ContainerStarted","Data":"6021a4eeadae376e01513d7b59e33ee0bd12bffbb3c4877e09ee52c26aa2f9cf"} Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.852035 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9cc4447a-6898-4437-b9a9-fa907be9e74b","Type":"ContainerStarted","Data":"323fa3653b82d0fe2ca9df55f639a31b18e60b75ccc2d59f525f3b53862b36ae"} Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.856383 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9x6k8" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.866573 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.8665531450000001 podStartE2EDuration="1.866553145s" podCreationTimestamp="2026-01-30 21:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:20.866132933 +0000 UTC m=+152.019279071" watchObservedRunningTime="2026-01-30 21:18:20.866553145 +0000 UTC m=+152.019699283" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.909934 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:18:20 crc kubenswrapper[4834]: I0130 21:18:20.983547 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q6bh"] Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.029768 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.035718 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wp8vc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.094942 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:18:21 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:21 crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:21 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.094989 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.143229 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xpwnw"] Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.157569 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.164087 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xpwnw"] Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.164376 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.186272 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwzrw"] Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.193933 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.220018 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fqv56" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.272038 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.273079 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.278887 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.279161 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.291747 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.308532 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-utilities\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.308616 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-catalog-content\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.308655 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdkr\" (UniqueName: \"kubernetes.io/projected/864b4107-52a4-4db4-a6f7-ca80d4122d26-kube-api-access-qjdkr\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.409903 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-utilities\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.409959 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5032e422-75b9-488e-af5f-98b63ac948d9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5032e422-75b9-488e-af5f-98b63ac948d9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.409996 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5032e422-75b9-488e-af5f-98b63ac948d9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5032e422-75b9-488e-af5f-98b63ac948d9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.410019 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-catalog-content\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.410045 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdkr\" (UniqueName: \"kubernetes.io/projected/864b4107-52a4-4db4-a6f7-ca80d4122d26-kube-api-access-qjdkr\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.410745 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-utilities\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.411002 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-catalog-content\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.439779 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdkr\" (UniqueName: \"kubernetes.io/projected/864b4107-52a4-4db4-a6f7-ca80d4122d26-kube-api-access-qjdkr\") pod \"redhat-operators-xpwnw\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") " pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.510814 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5032e422-75b9-488e-af5f-98b63ac948d9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5032e422-75b9-488e-af5f-98b63ac948d9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.510873 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5032e422-75b9-488e-af5f-98b63ac948d9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5032e422-75b9-488e-af5f-98b63ac948d9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.510956 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5032e422-75b9-488e-af5f-98b63ac948d9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5032e422-75b9-488e-af5f-98b63ac948d9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.547166 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.547700 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5032e422-75b9-488e-af5f-98b63ac948d9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5032e422-75b9-488e-af5f-98b63ac948d9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.548806 4834 patch_prober.go:28] interesting pod/console-f9d7485db-4k6d4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.548867 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4k6d4" podUID="59b3b974-9ba8-426b-8836-34ecbb56f86f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.584029 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hgnmx"] Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.585006 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.585024 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.585034 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgnmx"] Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.585647 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.642806 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.719598 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-utilities\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.719659 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lpbh\" (UniqueName: \"kubernetes.io/projected/54d43e08-9a17-464b-8808-5bce5d3502d0-kube-api-access-8lpbh\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.719702 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-catalog-content\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.811918 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-25hvk container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.811966 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-25hvk" podUID="aae73693-2f17-4d81-9e1e-f510035bd84f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.812328 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-25hvk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.812348 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-25hvk" podUID="aae73693-2f17-4d81-9e1e-f510035bd84f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.820708 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lpbh\" (UniqueName: \"kubernetes.io/projected/54d43e08-9a17-464b-8808-5bce5d3502d0-kube-api-access-8lpbh\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.820759 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-catalog-content\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " 
pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.820828 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-utilities\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.821215 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-utilities\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.821714 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-catalog-content\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.841478 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lpbh\" (UniqueName: \"kubernetes.io/projected/54d43e08-9a17-464b-8808-5bce5d3502d0-kube-api-access-8lpbh\") pod \"redhat-operators-hgnmx\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.900659 4834 generic.go:334] "Generic (PLEG): container finished" podID="9cc4447a-6898-4437-b9a9-fa907be9e74b" containerID="6021a4eeadae376e01513d7b59e33ee0bd12bffbb3c4877e09ee52c26aa2f9cf" exitCode=0 Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.901033 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9cc4447a-6898-4437-b9a9-fa907be9e74b","Type":"ContainerDied","Data":"6021a4eeadae376e01513d7b59e33ee0bd12bffbb3c4877e09ee52c26aa2f9cf"} Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.915509 4834 generic.go:334] "Generic (PLEG): container finished" podID="433db183-f17f-4f55-b6f4-901614906a48" containerID="7c1015e35b3686baab7cbc7116ac76be04162889325e2e44542f16bdb003a52e" exitCode=0 Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.915589 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q6bh" event={"ID":"433db183-f17f-4f55-b6f4-901614906a48","Type":"ContainerDied","Data":"7c1015e35b3686baab7cbc7116ac76be04162889325e2e44542f16bdb003a52e"} Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.915619 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q6bh" event={"ID":"433db183-f17f-4f55-b6f4-901614906a48","Type":"ContainerStarted","Data":"cb10fd780459218ab3d2e6c78d2be58c0ab75af312e72887b69f184bfd45bf3f"} Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.916292 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.918311 4834 generic.go:334] "Generic (PLEG): container finished" podID="ae936d72-5b36-46cb-a845-714057922b0e" containerID="91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e" exitCode=0 Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.919099 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwzrw" event={"ID":"ae936d72-5b36-46cb-a845-714057922b0e","Type":"ContainerDied","Data":"91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e"} Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.919125 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwzrw" event={"ID":"ae936d72-5b36-46cb-a845-714057922b0e","Type":"ContainerStarted","Data":"78fba0371b180944601c83095db12831fa7592a294b8641f37e821d49a3ce2ce"} Jan 30 21:18:21 crc kubenswrapper[4834]: I0130 21:18:21.979740 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xpwnw"] Jan 30 21:18:22 crc kubenswrapper[4834]: W0130 21:18:22.067312 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod864b4107_52a4_4db4_a6f7_ca80d4122d26.slice/crio-305828a82b1d084818d6d9ec10dc44b438a6ef75516fd8ef37186d7affeb1a44 WatchSource:0}: Error finding container 305828a82b1d084818d6d9ec10dc44b438a6ef75516fd8ef37186d7affeb1a44: Status 404 returned error can't find the container with id 305828a82b1d084818d6d9ec10dc44b438a6ef75516fd8ef37186d7affeb1a44 Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.072890 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.078866 4834 patch_prober.go:28] interesting 
pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:18:22 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:22 crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:22 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.079000 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.100755 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.283702 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 30 21:18:22 crc kubenswrapper[4834]: W0130 21:18:22.335228 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5032e422_75b9_488e_af5f_98b63ac948d9.slice/crio-194b4f23a31960209e499ad8e1b23985197ad503ceb292016ace376adef93559 WatchSource:0}: Error finding container 194b4f23a31960209e499ad8e1b23985197ad503ceb292016ace376adef93559: Status 404 returned error can't find the container with id 194b4f23a31960209e499ad8e1b23985197ad503ceb292016ace376adef93559 Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.589264 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hgnmx"] Jan 30 21:18:22 crc kubenswrapper[4834]: W0130 21:18:22.606665 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d43e08_9a17_464b_8808_5bce5d3502d0.slice/crio-926a7882bf58136c3cf604f80ead2818bf8a5495e694665dee16ae83a0abca51 WatchSource:0}: Error finding container 926a7882bf58136c3cf604f80ead2818bf8a5495e694665dee16ae83a0abca51: Status 404 returned error can't find the container with id 926a7882bf58136c3cf604f80ead2818bf8a5495e694665dee16ae83a0abca51 Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.957751 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5032e422-75b9-488e-af5f-98b63ac948d9","Type":"ContainerStarted","Data":"194b4f23a31960209e499ad8e1b23985197ad503ceb292016ace376adef93559"} Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.967563 4834 generic.go:334] "Generic (PLEG): container finished" podID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerID="3510a33f834cae793af7308cc5588e6d27f3af17e6897c8dd94013cb0dba0cd4" exitCode=0 Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.967630 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpwnw" event={"ID":"864b4107-52a4-4db4-a6f7-ca80d4122d26","Type":"ContainerDied","Data":"3510a33f834cae793af7308cc5588e6d27f3af17e6897c8dd94013cb0dba0cd4"} Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.975524 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpwnw" event={"ID":"864b4107-52a4-4db4-a6f7-ca80d4122d26","Type":"ContainerStarted","Data":"305828a82b1d084818d6d9ec10dc44b438a6ef75516fd8ef37186d7affeb1a44"} Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.996635 4834 generic.go:334] "Generic (PLEG): container finished" podID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerID="953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450" exitCode=0 Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.997169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-hgnmx" event={"ID":"54d43e08-9a17-464b-8808-5bce5d3502d0","Type":"ContainerDied","Data":"953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450"} Jan 30 21:18:22 crc kubenswrapper[4834]: I0130 21:18:22.997248 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgnmx" event={"ID":"54d43e08-9a17-464b-8808-5bce5d3502d0","Type":"ContainerStarted","Data":"926a7882bf58136c3cf604f80ead2818bf8a5495e694665dee16ae83a0abca51"} Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.076499 4834 patch_prober.go:28] interesting pod/router-default-5444994796-vmqm2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 30 21:18:23 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Jan 30 21:18:23 crc kubenswrapper[4834]: [+]process-running ok Jan 30 21:18:23 crc kubenswrapper[4834]: healthz check failed Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.076823 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-vmqm2" podUID="57726227-3cf1-4553-9b60-63e2082c887d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.336879 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.455780 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc4447a-6898-4437-b9a9-fa907be9e74b-kube-api-access\") pod \"9cc4447a-6898-4437-b9a9-fa907be9e74b\" (UID: \"9cc4447a-6898-4437-b9a9-fa907be9e74b\") " Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.455866 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc4447a-6898-4437-b9a9-fa907be9e74b-kubelet-dir\") pod \"9cc4447a-6898-4437-b9a9-fa907be9e74b\" (UID: \"9cc4447a-6898-4437-b9a9-fa907be9e74b\") " Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.457661 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cc4447a-6898-4437-b9a9-fa907be9e74b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9cc4447a-6898-4437-b9a9-fa907be9e74b" (UID: "9cc4447a-6898-4437-b9a9-fa907be9e74b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.463939 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc4447a-6898-4437-b9a9-fa907be9e74b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9cc4447a-6898-4437-b9a9-fa907be9e74b" (UID: "9cc4447a-6898-4437-b9a9-fa907be9e74b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.559064 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc4447a-6898-4437-b9a9-fa907be9e74b-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:23 crc kubenswrapper[4834]: I0130 21:18:23.559105 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc4447a-6898-4437-b9a9-fa907be9e74b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:24 crc kubenswrapper[4834]: I0130 21:18:24.008478 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 30 21:18:24 crc kubenswrapper[4834]: I0130 21:18:24.008498 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9cc4447a-6898-4437-b9a9-fa907be9e74b","Type":"ContainerDied","Data":"323fa3653b82d0fe2ca9df55f639a31b18e60b75ccc2d59f525f3b53862b36ae"} Jan 30 21:18:24 crc kubenswrapper[4834]: I0130 21:18:24.008545 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="323fa3653b82d0fe2ca9df55f639a31b18e60b75ccc2d59f525f3b53862b36ae" Jan 30 21:18:24 crc kubenswrapper[4834]: I0130 21:18:24.010723 4834 generic.go:334] "Generic (PLEG): container finished" podID="5032e422-75b9-488e-af5f-98b63ac948d9" containerID="3a7ea602fbf285a7bb6645c17d04d96c4bbe0241e9f3a0a742825255623b327d" exitCode=0 Jan 30 21:18:24 crc kubenswrapper[4834]: I0130 21:18:24.010753 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5032e422-75b9-488e-af5f-98b63ac948d9","Type":"ContainerDied","Data":"3a7ea602fbf285a7bb6645c17d04d96c4bbe0241e9f3a0a742825255623b327d"} Jan 30 21:18:24 crc kubenswrapper[4834]: I0130 21:18:24.075423 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:24 crc kubenswrapper[4834]: I0130 21:18:24.079149 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-vmqm2" Jan 30 21:18:25 crc kubenswrapper[4834]: I0130 21:18:25.447411 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:25 crc kubenswrapper[4834]: I0130 21:18:25.503979 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5032e422-75b9-488e-af5f-98b63ac948d9-kube-api-access\") pod \"5032e422-75b9-488e-af5f-98b63ac948d9\" (UID: \"5032e422-75b9-488e-af5f-98b63ac948d9\") " Jan 30 21:18:25 crc kubenswrapper[4834]: I0130 21:18:25.504093 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5032e422-75b9-488e-af5f-98b63ac948d9-kubelet-dir\") pod \"5032e422-75b9-488e-af5f-98b63ac948d9\" (UID: \"5032e422-75b9-488e-af5f-98b63ac948d9\") " Jan 30 21:18:25 crc kubenswrapper[4834]: I0130 21:18:25.504367 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5032e422-75b9-488e-af5f-98b63ac948d9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5032e422-75b9-488e-af5f-98b63ac948d9" (UID: "5032e422-75b9-488e-af5f-98b63ac948d9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:18:25 crc kubenswrapper[4834]: I0130 21:18:25.510573 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5032e422-75b9-488e-af5f-98b63ac948d9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5032e422-75b9-488e-af5f-98b63ac948d9" (UID: "5032e422-75b9-488e-af5f-98b63ac948d9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:18:25 crc kubenswrapper[4834]: I0130 21:18:25.605212 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5032e422-75b9-488e-af5f-98b63ac948d9-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:25 crc kubenswrapper[4834]: I0130 21:18:25.605251 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5032e422-75b9-488e-af5f-98b63ac948d9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:18:26 crc kubenswrapper[4834]: I0130 21:18:26.027876 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5032e422-75b9-488e-af5f-98b63ac948d9","Type":"ContainerDied","Data":"194b4f23a31960209e499ad8e1b23985197ad503ceb292016ace376adef93559"} Jan 30 21:18:26 crc kubenswrapper[4834]: I0130 21:18:26.027915 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="194b4f23a31960209e499ad8e1b23985197ad503ceb292016ace376adef93559" Jan 30 21:18:26 crc kubenswrapper[4834]: I0130 21:18:26.027970 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 30 21:18:27 crc kubenswrapper[4834]: I0130 21:18:27.184678 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9c6nm" Jan 30 21:18:31 crc kubenswrapper[4834]: I0130 21:18:31.549253 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:31 crc kubenswrapper[4834]: I0130 21:18:31.554236 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:18:31 crc kubenswrapper[4834]: I0130 21:18:31.815747 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-25hvk" Jan 30 21:18:32 crc kubenswrapper[4834]: I0130 21:18:32.631251 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:18:32 crc kubenswrapper[4834]: I0130 21:18:32.638050 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8a589ab-0e20-4c47-a923-363b3be97b20-metrics-certs\") pod \"network-metrics-daemon-j5pcw\" (UID: \"f8a589ab-0e20-4c47-a923-363b3be97b20\") " pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:18:32 crc kubenswrapper[4834]: I0130 21:18:32.759056 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j5pcw" Jan 30 21:18:34 crc kubenswrapper[4834]: I0130 21:18:34.160868 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:18:34 crc kubenswrapper[4834]: I0130 21:18:34.162194 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:18:38 crc kubenswrapper[4834]: I0130 21:18:38.268638 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:18:52 crc kubenswrapper[4834]: I0130 21:18:52.042241 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fq6x5" Jan 30 21:18:52 crc kubenswrapper[4834]: E0130 21:18:52.072535 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 30 21:18:52 crc kubenswrapper[4834]: E0130 21:18:52.072855 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bqwbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kwzrw_openshift-marketplace(ae936d72-5b36-46cb-a845-714057922b0e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:18:52 crc kubenswrapper[4834]: E0130 21:18:52.074555 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kwzrw" podUID="ae936d72-5b36-46cb-a845-714057922b0e" Jan 30 21:18:55 crc 
kubenswrapper[4834]: E0130 21:18:55.029425 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kwzrw" podUID="ae936d72-5b36-46cb-a845-714057922b0e" Jan 30 21:18:55 crc kubenswrapper[4834]: E0130 21:18:55.138638 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 21:18:55 crc kubenswrapper[4834]: E0130 21:18:55.138924 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qjdkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xpwnw_openshift-marketplace(864b4107-52a4-4db4-a6f7-ca80d4122d26): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:18:55 crc kubenswrapper[4834]: E0130 21:18:55.140889 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xpwnw" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" Jan 30 21:18:55 crc 
kubenswrapper[4834]: E0130 21:18:55.202651 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 30 21:18:55 crc kubenswrapper[4834]: E0130 21:18:55.202774 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lpbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-hgnmx_openshift-marketplace(54d43e08-9a17-464b-8808-5bce5d3502d0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 30 21:18:55 crc kubenswrapper[4834]: E0130 21:18:55.203840 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hgnmx" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" Jan 30 21:18:55 crc kubenswrapper[4834]: I0130 21:18:55.307448 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q6bh" event={"ID":"433db183-f17f-4f55-b6f4-901614906a48","Type":"ContainerStarted","Data":"ad15784b169ba0765ddd0e6a8c3c31eab3bb862092376c18e86d793223da1d48"} Jan 30 21:18:55 crc kubenswrapper[4834]: I0130 21:18:55.311960 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blc9b" event={"ID":"be0cb498-ae6b-47f1-8068-9f7653206006","Type":"ContainerStarted","Data":"07a2a2b436de6881879ee6955630e0cacd032a8e0126365cce92f5e5b19498d4"} Jan 30 21:18:55 crc kubenswrapper[4834]: E0130 21:18:55.312750 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hgnmx" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" Jan 30 21:18:55 crc kubenswrapper[4834]: E0130 21:18:55.313324 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-xpwnw" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" Jan 30 21:18:55 crc kubenswrapper[4834]: I0130 21:18:55.496289 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j5pcw"] Jan 30 21:18:55 crc kubenswrapper[4834]: W0130 21:18:55.503509 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8a589ab_0e20_4c47_a923_363b3be97b20.slice/crio-a30ff597b04873c2c64dca228f9f6185b0416a8e546bab7870bc4ea80c104855 WatchSource:0}: Error finding container a30ff597b04873c2c64dca228f9f6185b0416a8e546bab7870bc4ea80c104855: Status 404 returned error can't find the container with id a30ff597b04873c2c64dca228f9f6185b0416a8e546bab7870bc4ea80c104855 Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.320616 4834 generic.go:334] "Generic (PLEG): container finished" podID="d19012b5-546c-4af0-a419-f57194f5eff8" containerID="c661d5053142de574fe904aa8ad3a217084af08e2a4e69992c48113343554d21" exitCode=0 Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.321226 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82j9j" event={"ID":"d19012b5-546c-4af0-a419-f57194f5eff8","Type":"ContainerDied","Data":"c661d5053142de574fe904aa8ad3a217084af08e2a4e69992c48113343554d21"} Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.327011 4834 generic.go:334] "Generic (PLEG): container finished" podID="433db183-f17f-4f55-b6f4-901614906a48" containerID="ad15784b169ba0765ddd0e6a8c3c31eab3bb862092376c18e86d793223da1d48" exitCode=0 Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.327075 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q6bh" event={"ID":"433db183-f17f-4f55-b6f4-901614906a48","Type":"ContainerDied","Data":"ad15784b169ba0765ddd0e6a8c3c31eab3bb862092376c18e86d793223da1d48"} Jan 30 21:18:56 crc 
kubenswrapper[4834]: I0130 21:18:56.333186 4834 generic.go:334] "Generic (PLEG): container finished" podID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerID="da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9" exitCode=0 Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.333295 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sz2l" event={"ID":"768fba83-c2e4-401a-81c8-ad4ecec9dac7","Type":"ContainerDied","Data":"da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9"} Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.337959 4834 generic.go:334] "Generic (PLEG): container finished" podID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerID="3ce00cca1d615c709d35d35a2854f3f21a375e248268d59229463bc354fdd2ff" exitCode=0 Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.338074 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lpsm" event={"ID":"cd80c77f-7d28-4be9-ac43-d8b7e09872e3","Type":"ContainerDied","Data":"3ce00cca1d615c709d35d35a2854f3f21a375e248268d59229463bc354fdd2ff"} Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.355843 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" event={"ID":"f8a589ab-0e20-4c47-a923-363b3be97b20","Type":"ContainerStarted","Data":"ad7aea91220a68b4f8a22cb1f4c7deea0f8ad76541840b75335bb3945567605b"} Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.355883 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" event={"ID":"f8a589ab-0e20-4c47-a923-363b3be97b20","Type":"ContainerStarted","Data":"f373746ae7fa39aa9b721871f71ffa945c00e07338d7b50b6308999168f9549a"} Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.355892 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j5pcw" 
event={"ID":"f8a589ab-0e20-4c47-a923-363b3be97b20","Type":"ContainerStarted","Data":"a30ff597b04873c2c64dca228f9f6185b0416a8e546bab7870bc4ea80c104855"} Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.361410 4834 generic.go:334] "Generic (PLEG): container finished" podID="be0cb498-ae6b-47f1-8068-9f7653206006" containerID="07a2a2b436de6881879ee6955630e0cacd032a8e0126365cce92f5e5b19498d4" exitCode=0 Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.361449 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blc9b" event={"ID":"be0cb498-ae6b-47f1-8068-9f7653206006","Type":"ContainerDied","Data":"07a2a2b436de6881879ee6955630e0cacd032a8e0126365cce92f5e5b19498d4"} Jan 30 21:18:56 crc kubenswrapper[4834]: I0130 21:18:56.458831 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j5pcw" podStartSLOduration=167.458815425 podStartE2EDuration="2m47.458815425s" podCreationTimestamp="2026-01-30 21:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:18:56.456783266 +0000 UTC m=+187.609929424" watchObservedRunningTime="2026-01-30 21:18:56.458815425 +0000 UTC m=+187.611961563" Jan 30 21:18:57 crc kubenswrapper[4834]: I0130 21:18:57.370054 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sz2l" event={"ID":"768fba83-c2e4-401a-81c8-ad4ecec9dac7","Type":"ContainerStarted","Data":"d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe"} Jan 30 21:18:57 crc kubenswrapper[4834]: I0130 21:18:57.382914 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blc9b" event={"ID":"be0cb498-ae6b-47f1-8068-9f7653206006","Type":"ContainerStarted","Data":"6d673b42d73d416821a96ffceadbda400138b8d30efdc72ef16245d229d08a67"} Jan 30 21:18:57 crc kubenswrapper[4834]: 
I0130 21:18:57.385852 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82j9j" event={"ID":"d19012b5-546c-4af0-a419-f57194f5eff8","Type":"ContainerStarted","Data":"3ef5d92d4409fbb36784676c33dceff9c72e99e8c39c0d47370be281e229b9dc"} Jan 30 21:18:57 crc kubenswrapper[4834]: I0130 21:18:57.392681 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4sz2l" podStartSLOduration=2.453139628 podStartE2EDuration="39.392663927s" podCreationTimestamp="2026-01-30 21:18:18 +0000 UTC" firstStartedPulling="2026-01-30 21:18:19.850883767 +0000 UTC m=+151.004029905" lastFinishedPulling="2026-01-30 21:18:56.790408066 +0000 UTC m=+187.943554204" observedRunningTime="2026-01-30 21:18:57.390049571 +0000 UTC m=+188.543195709" watchObservedRunningTime="2026-01-30 21:18:57.392663927 +0000 UTC m=+188.545810055" Jan 30 21:18:57 crc kubenswrapper[4834]: I0130 21:18:57.408412 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-82j9j" podStartSLOduration=2.186779159 podStartE2EDuration="40.40838848s" podCreationTimestamp="2026-01-30 21:18:17 +0000 UTC" firstStartedPulling="2026-01-30 21:18:18.816877221 +0000 UTC m=+149.970023359" lastFinishedPulling="2026-01-30 21:18:57.038486502 +0000 UTC m=+188.191632680" observedRunningTime="2026-01-30 21:18:57.406050751 +0000 UTC m=+188.559196889" watchObservedRunningTime="2026-01-30 21:18:57.40838848 +0000 UTC m=+188.561534618" Jan 30 21:18:57 crc kubenswrapper[4834]: I0130 21:18:57.424970 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-blc9b" podStartSLOduration=2.450455789 podStartE2EDuration="39.424947617s" podCreationTimestamp="2026-01-30 21:18:18 +0000 UTC" firstStartedPulling="2026-01-30 21:18:19.848539788 +0000 UTC m=+151.001685936" lastFinishedPulling="2026-01-30 21:18:56.823031626 +0000 UTC 
m=+187.976177764" observedRunningTime="2026-01-30 21:18:57.423927447 +0000 UTC m=+188.577073595" watchObservedRunningTime="2026-01-30 21:18:57.424947617 +0000 UTC m=+188.578093755" Jan 30 21:18:57 crc kubenswrapper[4834]: I0130 21:18:57.682914 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.099860 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.100231 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.392383 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q6bh" event={"ID":"433db183-f17f-4f55-b6f4-901614906a48","Type":"ContainerStarted","Data":"166a044c17c9fc3d0768bb76d46bb024212d7535f8f39ccc71869a751772619c"} Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.396070 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lpsm" event={"ID":"cd80c77f-7d28-4be9-ac43-d8b7e09872e3","Type":"ContainerStarted","Data":"ba62723a9beee86abecd60454bc47353529b036465cda5a4dd9c834e5882f07d"} Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.420658 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9q6bh" podStartSLOduration=3.060824352 podStartE2EDuration="38.420641717s" podCreationTimestamp="2026-01-30 21:18:20 +0000 UTC" firstStartedPulling="2026-01-30 21:18:21.919600862 +0000 UTC m=+153.072747000" lastFinishedPulling="2026-01-30 21:18:57.279418217 +0000 UTC m=+188.432564365" observedRunningTime="2026-01-30 21:18:58.419955537 +0000 UTC m=+189.573101675" watchObservedRunningTime="2026-01-30 
21:18:58.420641717 +0000 UTC m=+189.573787855" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.438634 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4lpsm" podStartSLOduration=3.042892651 podStartE2EDuration="40.438615626s" podCreationTimestamp="2026-01-30 21:18:18 +0000 UTC" firstStartedPulling="2026-01-30 21:18:19.84825687 +0000 UTC m=+151.001403008" lastFinishedPulling="2026-01-30 21:18:57.243979845 +0000 UTC m=+188.397125983" observedRunningTime="2026-01-30 21:18:58.437996658 +0000 UTC m=+189.591142796" watchObservedRunningTime="2026-01-30 21:18:58.438615626 +0000 UTC m=+189.591761764" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.512626 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.512686 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.718770 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.718818 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.889467 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:58 crc kubenswrapper[4834]: I0130 21:18:58.889517 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:18:59 crc kubenswrapper[4834]: I0130 21:18:59.258899 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-82j9j" 
podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="registry-server" probeResult="failure" output=< Jan 30 21:18:59 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:18:59 crc kubenswrapper[4834]: > Jan 30 21:18:59 crc kubenswrapper[4834]: I0130 21:18:59.553673 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4sz2l" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="registry-server" probeResult="failure" output=< Jan 30 21:18:59 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:18:59 crc kubenswrapper[4834]: > Jan 30 21:18:59 crc kubenswrapper[4834]: I0130 21:18:59.753178 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-blc9b" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="registry-server" probeResult="failure" output=< Jan 30 21:18:59 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:18:59 crc kubenswrapper[4834]: > Jan 30 21:18:59 crc kubenswrapper[4834]: I0130 21:18:59.921726 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4lpsm" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerName="registry-server" probeResult="failure" output=< Jan 30 21:18:59 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:18:59 crc kubenswrapper[4834]: > Jan 30 21:19:00 crc kubenswrapper[4834]: I0130 21:19:00.489304 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:19:00 crc kubenswrapper[4834]: I0130 21:19:00.489849 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:19:00 crc kubenswrapper[4834]: I0130 21:19:00.548501 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.672234 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 21:19:01 crc kubenswrapper[4834]: E0130 21:19:01.674479 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc4447a-6898-4437-b9a9-fa907be9e74b" containerName="pruner" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.674703 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc4447a-6898-4437-b9a9-fa907be9e74b" containerName="pruner" Jan 30 21:19:01 crc kubenswrapper[4834]: E0130 21:19:01.674968 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5032e422-75b9-488e-af5f-98b63ac948d9" containerName="pruner" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.675160 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5032e422-75b9-488e-af5f-98b63ac948d9" containerName="pruner" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.675613 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc4447a-6898-4437-b9a9-fa907be9e74b" containerName="pruner" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.675864 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5032e422-75b9-488e-af5f-98b63ac948d9" containerName="pruner" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.676796 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.678385 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.683689 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.684065 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.811279 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.811340 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.912059 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.912133 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.912189 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:01 crc kubenswrapper[4834]: I0130 21:19:01.937482 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:02 crc kubenswrapper[4834]: I0130 21:19:01.999359 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:02 crc kubenswrapper[4834]: I0130 21:19:02.420784 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 30 21:19:02 crc kubenswrapper[4834]: W0130 21:19:02.430687 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc941f94d_1e9b_4ab4_a78e_b25a38f86ff6.slice/crio-0a70554325abad24c45af7a078450f4d929601839854ad0e1b3c1331f18abded WatchSource:0}: Error finding container 0a70554325abad24c45af7a078450f4d929601839854ad0e1b3c1331f18abded: Status 404 returned error can't find the container with id 0a70554325abad24c45af7a078450f4d929601839854ad0e1b3c1331f18abded Jan 30 21:19:02 crc kubenswrapper[4834]: I0130 21:19:02.487765 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9q6bh" Jan 30 21:19:03 crc kubenswrapper[4834]: I0130 21:19:03.440834 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6","Type":"ContainerStarted","Data":"b68445eaed8367ac15d698c42021eec82112fea60dfc967eb08591ddfde9d083"} Jan 30 21:19:03 crc kubenswrapper[4834]: I0130 21:19:03.441167 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6","Type":"ContainerStarted","Data":"0a70554325abad24c45af7a078450f4d929601839854ad0e1b3c1331f18abded"} Jan 30 21:19:03 crc kubenswrapper[4834]: I0130 21:19:03.453653 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.453635663 podStartE2EDuration="2.453635663s" podCreationTimestamp="2026-01-30 21:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:03.45182913 +0000 UTC m=+194.604975278" watchObservedRunningTime="2026-01-30 21:19:03.453635663 +0000 UTC m=+194.606781801" Jan 30 21:19:04 crc kubenswrapper[4834]: I0130 21:19:04.160877 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:19:04 crc kubenswrapper[4834]: I0130 21:19:04.160938 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:19:04 crc kubenswrapper[4834]: I0130 21:19:04.447569 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="c941f94d-1e9b-4ab4-a78e-b25a38f86ff6" containerID="b68445eaed8367ac15d698c42021eec82112fea60dfc967eb08591ddfde9d083" exitCode=0 Jan 30 21:19:04 crc kubenswrapper[4834]: I0130 21:19:04.447610 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6","Type":"ContainerDied","Data":"b68445eaed8367ac15d698c42021eec82112fea60dfc967eb08591ddfde9d083"} Jan 30 21:19:05 crc kubenswrapper[4834]: I0130 21:19:05.750564 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:05 crc kubenswrapper[4834]: I0130 21:19:05.761021 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kube-api-access\") pod \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\" (UID: \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\") " Jan 30 21:19:05 crc kubenswrapper[4834]: I0130 21:19:05.761097 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kubelet-dir\") pod \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\" (UID: \"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6\") " Jan 30 21:19:05 crc kubenswrapper[4834]: I0130 21:19:05.761294 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c941f94d-1e9b-4ab4-a78e-b25a38f86ff6" (UID: "c941f94d-1e9b-4ab4-a78e-b25a38f86ff6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:05 crc kubenswrapper[4834]: I0130 21:19:05.776412 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c941f94d-1e9b-4ab4-a78e-b25a38f86ff6" (UID: "c941f94d-1e9b-4ab4-a78e-b25a38f86ff6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:05 crc kubenswrapper[4834]: I0130 21:19:05.862533 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:05 crc kubenswrapper[4834]: I0130 21:19:05.862572 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c941f94d-1e9b-4ab4-a78e-b25a38f86ff6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.459632 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c941f94d-1e9b-4ab4-a78e-b25a38f86ff6","Type":"ContainerDied","Data":"0a70554325abad24c45af7a078450f4d929601839854ad0e1b3c1331f18abded"} Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.459667 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a70554325abad24c45af7a078450f4d929601839854ad0e1b3c1331f18abded" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.459719 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.864134 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 21:19:06 crc kubenswrapper[4834]: E0130 21:19:06.864781 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c941f94d-1e9b-4ab4-a78e-b25a38f86ff6" containerName="pruner" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.864803 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c941f94d-1e9b-4ab4-a78e-b25a38f86ff6" containerName="pruner" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.865137 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c941f94d-1e9b-4ab4-a78e-b25a38f86ff6" containerName="pruner" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.866036 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.870511 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.870871 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.870898 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.978308 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kube-api-access\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.978845 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:06 crc kubenswrapper[4834]: I0130 21:19:06.978959 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-var-lock\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:07 crc kubenswrapper[4834]: I0130 21:19:07.080447 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:07 crc kubenswrapper[4834]: I0130 21:19:07.080551 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-var-lock\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:07 crc kubenswrapper[4834]: I0130 21:19:07.080587 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kube-api-access\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:07 crc kubenswrapper[4834]: I0130 21:19:07.081018 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:07 crc kubenswrapper[4834]: I0130 21:19:07.081019 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-var-lock\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:07 crc kubenswrapper[4834]: I0130 21:19:07.098047 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kube-api-access\") pod \"installer-9-crc\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:07 crc kubenswrapper[4834]: I0130 21:19:07.193626 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:07 crc kubenswrapper[4834]: I0130 21:19:07.585191 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.170213 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.216766 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-82j9j" Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.475076 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b","Type":"ContainerStarted","Data":"b7e7cec5cc85c855cf47812d31428bbc14a8029c3603805f5f42fcfa46620cf5"} Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.475118 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b","Type":"ContainerStarted","Data":"bda3294a0422731835d643521e55d982afc868663145da3d27a94dc32bd10168"} Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.564284 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.610584 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.770435 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.813729 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-blc9b" Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.928566 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:19:08 crc kubenswrapper[4834]: I0130 21:19:08.965601 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:19:09 crc kubenswrapper[4834]: I0130 21:19:09.148904 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4sz2l"] Jan 30 21:19:09 crc kubenswrapper[4834]: I0130 21:19:09.513008 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.5129802 podStartE2EDuration="3.5129802s" podCreationTimestamp="2026-01-30 21:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:19:09.509008493 +0000 UTC m=+200.662154671" watchObservedRunningTime="2026-01-30 21:19:09.5129802 +0000 UTC m=+200.666126378" Jan 30 21:19:10 crc kubenswrapper[4834]: I0130 21:19:10.493379 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4sz2l" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="registry-server" containerID="cri-o://d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe" gracePeriod=2 Jan 30 21:19:10 crc kubenswrapper[4834]: I0130 21:19:10.884351 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.034023 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-catalog-content\") pod \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.034117 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-utilities\") pod \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.034156 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlbdl\" (UniqueName: \"kubernetes.io/projected/768fba83-c2e4-401a-81c8-ad4ecec9dac7-kube-api-access-dlbdl\") pod \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\" (UID: \"768fba83-c2e4-401a-81c8-ad4ecec9dac7\") " Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.034926 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-utilities" (OuterVolumeSpecName: "utilities") pod "768fba83-c2e4-401a-81c8-ad4ecec9dac7" (UID: "768fba83-c2e4-401a-81c8-ad4ecec9dac7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.039580 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768fba83-c2e4-401a-81c8-ad4ecec9dac7-kube-api-access-dlbdl" (OuterVolumeSpecName: "kube-api-access-dlbdl") pod "768fba83-c2e4-401a-81c8-ad4ecec9dac7" (UID: "768fba83-c2e4-401a-81c8-ad4ecec9dac7"). InnerVolumeSpecName "kube-api-access-dlbdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.086522 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "768fba83-c2e4-401a-81c8-ad4ecec9dac7" (UID: "768fba83-c2e4-401a-81c8-ad4ecec9dac7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.135436 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.135491 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/768fba83-c2e4-401a-81c8-ad4ecec9dac7-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.135509 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlbdl\" (UniqueName: \"kubernetes.io/projected/768fba83-c2e4-401a-81c8-ad4ecec9dac7-kube-api-access-dlbdl\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.503105 4834 generic.go:334] "Generic (PLEG): container finished" podID="ae936d72-5b36-46cb-a845-714057922b0e" containerID="35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5" exitCode=0 Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.503169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwzrw" event={"ID":"ae936d72-5b36-46cb-a845-714057922b0e","Type":"ContainerDied","Data":"35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5"} Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.507046 4834 generic.go:334] "Generic (PLEG): container 
finished" podID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerID="8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750" exitCode=0 Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.507094 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgnmx" event={"ID":"54d43e08-9a17-464b-8808-5bce5d3502d0","Type":"ContainerDied","Data":"8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750"} Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.511637 4834 generic.go:334] "Generic (PLEG): container finished" podID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerID="d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe" exitCode=0 Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.511675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sz2l" event={"ID":"768fba83-c2e4-401a-81c8-ad4ecec9dac7","Type":"ContainerDied","Data":"d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe"} Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.511702 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4sz2l" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.511715 4834 scope.go:117] "RemoveContainer" containerID="d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.511703 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4sz2l" event={"ID":"768fba83-c2e4-401a-81c8-ad4ecec9dac7","Type":"ContainerDied","Data":"d2e5e5fbbb2a844194506a70ff916d9b05905ce9a106e138f4ec380d16817e38"} Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.526642 4834 scope.go:117] "RemoveContainer" containerID="da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.564851 4834 scope.go:117] "RemoveContainer" containerID="7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.570818 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4sz2l"] Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.573828 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4sz2l"] Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.584501 4834 scope.go:117] "RemoveContainer" containerID="d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe" Jan 30 21:19:11 crc kubenswrapper[4834]: E0130 21:19:11.584832 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe\": container with ID starting with d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe not found: ID does not exist" containerID="d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.584863 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe"} err="failed to get container status \"d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe\": rpc error: code = NotFound desc = could not find container \"d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe\": container with ID starting with d099bb75f2cb3b9853d42489165b023890e2b340e9453e452ee45f5b85e0aebe not found: ID does not exist" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.584909 4834 scope.go:117] "RemoveContainer" containerID="da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9" Jan 30 21:19:11 crc kubenswrapper[4834]: E0130 21:19:11.585274 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9\": container with ID starting with da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9 not found: ID does not exist" containerID="da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.585303 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9"} err="failed to get container status \"da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9\": rpc error: code = NotFound desc = could not find container \"da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9\": container with ID starting with da183d7a68b065a42d36933ddad8a1ad3719c59c3355f9cf773ea9a579a94fa9 not found: ID does not exist" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.585325 4834 scope.go:117] "RemoveContainer" containerID="7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a" Jan 30 21:19:11 crc kubenswrapper[4834]: E0130 
21:19:11.585822 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a\": container with ID starting with 7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a not found: ID does not exist" containerID="7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a" Jan 30 21:19:11 crc kubenswrapper[4834]: I0130 21:19:11.585877 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a"} err="failed to get container status \"7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a\": rpc error: code = NotFound desc = could not find container \"7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a\": container with ID starting with 7cb596d5bfa12a1c4774f6da9ab460f258707bfcb8e4cb8395622318557be98a not found: ID does not exist" Jan 30 21:19:12 crc kubenswrapper[4834]: I0130 21:19:12.525275 4834 generic.go:334] "Generic (PLEG): container finished" podID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerID="d15eeb0972dde61a8fc29b96c45d5b8ec3cb738090417ded31ba59a33d8b43ba" exitCode=0 Jan 30 21:19:12 crc kubenswrapper[4834]: I0130 21:19:12.525312 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpwnw" event={"ID":"864b4107-52a4-4db4-a6f7-ca80d4122d26","Type":"ContainerDied","Data":"d15eeb0972dde61a8fc29b96c45d5b8ec3cb738090417ded31ba59a33d8b43ba"} Jan 30 21:19:12 crc kubenswrapper[4834]: I0130 21:19:12.946884 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4lpsm"] Jan 30 21:19:12 crc kubenswrapper[4834]: I0130 21:19:12.947371 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4lpsm" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" 
containerName="registry-server" containerID="cri-o://ba62723a9beee86abecd60454bc47353529b036465cda5a4dd9c834e5882f07d" gracePeriod=2 Jan 30 21:19:13 crc kubenswrapper[4834]: I0130 21:19:13.543101 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" path="/var/lib/kubelet/pods/768fba83-c2e4-401a-81c8-ad4ecec9dac7/volumes" Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.541672 4834 generic.go:334] "Generic (PLEG): container finished" podID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerID="ba62723a9beee86abecd60454bc47353529b036465cda5a4dd9c834e5882f07d" exitCode=0 Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.541789 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lpsm" event={"ID":"cd80c77f-7d28-4be9-ac43-d8b7e09872e3","Type":"ContainerDied","Data":"ba62723a9beee86abecd60454bc47353529b036465cda5a4dd9c834e5882f07d"} Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.793518 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.889287 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-catalog-content\") pod \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.889368 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7gxb\" (UniqueName: \"kubernetes.io/projected/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-kube-api-access-w7gxb\") pod \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.889444 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-utilities\") pod \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\" (UID: \"cd80c77f-7d28-4be9-ac43-d8b7e09872e3\") " Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.890169 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-utilities" (OuterVolumeSpecName: "utilities") pod "cd80c77f-7d28-4be9-ac43-d8b7e09872e3" (UID: "cd80c77f-7d28-4be9-ac43-d8b7e09872e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.896547 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-kube-api-access-w7gxb" (OuterVolumeSpecName: "kube-api-access-w7gxb") pod "cd80c77f-7d28-4be9-ac43-d8b7e09872e3" (UID: "cd80c77f-7d28-4be9-ac43-d8b7e09872e3"). InnerVolumeSpecName "kube-api-access-w7gxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.945578 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd80c77f-7d28-4be9-ac43-d8b7e09872e3" (UID: "cd80c77f-7d28-4be9-ac43-d8b7e09872e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.991034 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.991089 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7gxb\" (UniqueName: \"kubernetes.io/projected/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-kube-api-access-w7gxb\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:14 crc kubenswrapper[4834]: I0130 21:19:14.991108 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd80c77f-7d28-4be9-ac43-d8b7e09872e3-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:15 crc kubenswrapper[4834]: I0130 21:19:15.580295 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4lpsm" event={"ID":"cd80c77f-7d28-4be9-ac43-d8b7e09872e3","Type":"ContainerDied","Data":"03aaae38ae215aa94881f4daeef3cec3868f8d56f6738ee16890d0455b1e6556"} Jan 30 21:19:15 crc kubenswrapper[4834]: I0130 21:19:15.580379 4834 scope.go:117] "RemoveContainer" containerID="ba62723a9beee86abecd60454bc47353529b036465cda5a4dd9c834e5882f07d" Jan 30 21:19:15 crc kubenswrapper[4834]: I0130 21:19:15.580658 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4lpsm" Jan 30 21:19:15 crc kubenswrapper[4834]: I0130 21:19:15.611943 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4lpsm"] Jan 30 21:19:15 crc kubenswrapper[4834]: I0130 21:19:15.617938 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4lpsm"] Jan 30 21:19:16 crc kubenswrapper[4834]: I0130 21:19:16.280357 4834 scope.go:117] "RemoveContainer" containerID="3ce00cca1d615c709d35d35a2854f3f21a375e248268d59229463bc354fdd2ff" Jan 30 21:19:16 crc kubenswrapper[4834]: I0130 21:19:16.955922 4834 scope.go:117] "RemoveContainer" containerID="acd37b106904830e3568d9a76dbdcb05796535b9e44f91dc6e62cf6018d23056" Jan 30 21:19:17 crc kubenswrapper[4834]: I0130 21:19:17.546136 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" path="/var/lib/kubelet/pods/cd80c77f-7d28-4be9-ac43-d8b7e09872e3/volumes" Jan 30 21:19:18 crc kubenswrapper[4834]: I0130 21:19:18.604969 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgnmx" event={"ID":"54d43e08-9a17-464b-8808-5bce5d3502d0","Type":"ContainerStarted","Data":"1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f"} Jan 30 21:19:18 crc kubenswrapper[4834]: I0130 21:19:18.635077 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hgnmx" podStartSLOduration=3.684040866 podStartE2EDuration="57.635046083s" podCreationTimestamp="2026-01-30 21:18:21 +0000 UTC" firstStartedPulling="2026-01-30 21:18:23.005274078 +0000 UTC m=+154.158420206" lastFinishedPulling="2026-01-30 21:19:16.956279235 +0000 UTC m=+208.109425423" observedRunningTime="2026-01-30 21:19:18.633486187 +0000 UTC m=+209.786632345" watchObservedRunningTime="2026-01-30 21:19:18.635046083 +0000 UTC m=+209.788192271" Jan 30 21:19:19 
crc kubenswrapper[4834]: I0130 21:19:19.614517 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwzrw" event={"ID":"ae936d72-5b36-46cb-a845-714057922b0e","Type":"ContainerStarted","Data":"0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf"} Jan 30 21:19:19 crc kubenswrapper[4834]: I0130 21:19:19.617637 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpwnw" event={"ID":"864b4107-52a4-4db4-a6f7-ca80d4122d26","Type":"ContainerStarted","Data":"2eab9a1aeb1b44fcd4c435f34581bd7f8f7af74ef1d26c5664001c9ce0032f2e"} Jan 30 21:19:19 crc kubenswrapper[4834]: I0130 21:19:19.633913 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwzrw" podStartSLOduration=3.604178201 podStartE2EDuration="59.633892086s" podCreationTimestamp="2026-01-30 21:18:20 +0000 UTC" firstStartedPulling="2026-01-30 21:18:21.921314843 +0000 UTC m=+153.074460971" lastFinishedPulling="2026-01-30 21:19:17.951028688 +0000 UTC m=+209.104174856" observedRunningTime="2026-01-30 21:19:19.631237628 +0000 UTC m=+210.784383786" watchObservedRunningTime="2026-01-30 21:19:19.633892086 +0000 UTC m=+210.787038234" Jan 30 21:19:19 crc kubenswrapper[4834]: I0130 21:19:19.651326 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xpwnw" podStartSLOduration=3.372590568 podStartE2EDuration="58.651307849s" podCreationTimestamp="2026-01-30 21:18:21 +0000 UTC" firstStartedPulling="2026-01-30 21:18:22.970094654 +0000 UTC m=+154.123240792" lastFinishedPulling="2026-01-30 21:19:18.248811915 +0000 UTC m=+209.401958073" observedRunningTime="2026-01-30 21:19:19.647681362 +0000 UTC m=+210.800827510" watchObservedRunningTime="2026-01-30 21:19:19.651307849 +0000 UTC m=+210.804453997" Jan 30 21:19:20 crc kubenswrapper[4834]: I0130 21:19:20.910633 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:19:20 crc kubenswrapper[4834]: I0130 21:19:20.912137 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:19:20 crc kubenswrapper[4834]: I0130 21:19:20.958215 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:19:21 crc kubenswrapper[4834]: I0130 21:19:21.548942 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:19:21 crc kubenswrapper[4834]: I0130 21:19:21.549021 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:19:21 crc kubenswrapper[4834]: I0130 21:19:21.917310 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:19:21 crc kubenswrapper[4834]: I0130 21:19:21.917573 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:19:22 crc kubenswrapper[4834]: I0130 21:19:22.606014 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xpwnw" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="registry-server" probeResult="failure" output=< Jan 30 21:19:22 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:19:22 crc kubenswrapper[4834]: > Jan 30 21:19:22 crc kubenswrapper[4834]: I0130 21:19:22.959355 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hgnmx" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="registry-server" probeResult="failure" output=< Jan 30 21:19:22 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s 
Jan 30 21:19:22 crc kubenswrapper[4834]: > Jan 30 21:19:30 crc kubenswrapper[4834]: I0130 21:19:30.371165 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xvvtx"] Jan 30 21:19:30 crc kubenswrapper[4834]: I0130 21:19:30.980746 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:19:31 crc kubenswrapper[4834]: I0130 21:19:31.034521 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwzrw"] Jan 30 21:19:31 crc kubenswrapper[4834]: I0130 21:19:31.639455 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:19:31 crc kubenswrapper[4834]: I0130 21:19:31.688344 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kwzrw" podUID="ae936d72-5b36-46cb-a845-714057922b0e" containerName="registry-server" containerID="cri-o://0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf" gracePeriod=2 Jan 30 21:19:31 crc kubenswrapper[4834]: I0130 21:19:31.716799 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xpwnw" Jan 30 21:19:31 crc kubenswrapper[4834]: I0130 21:19:31.987569 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.055495 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.616653 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgnmx"] Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.647009 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.735798 4834 generic.go:334] "Generic (PLEG): container finished" podID="ae936d72-5b36-46cb-a845-714057922b0e" containerID="0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf" exitCode=0 Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.735948 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwzrw" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.735989 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwzrw" event={"ID":"ae936d72-5b36-46cb-a845-714057922b0e","Type":"ContainerDied","Data":"0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf"} Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.736025 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwzrw" event={"ID":"ae936d72-5b36-46cb-a845-714057922b0e","Type":"ContainerDied","Data":"78fba0371b180944601c83095db12831fa7592a294b8641f37e821d49a3ce2ce"} Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.736050 4834 scope.go:117] "RemoveContainer" containerID="0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.755755 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-utilities\") pod \"ae936d72-5b36-46cb-a845-714057922b0e\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.756283 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqwbq\" (UniqueName: \"kubernetes.io/projected/ae936d72-5b36-46cb-a845-714057922b0e-kube-api-access-bqwbq\") pod 
\"ae936d72-5b36-46cb-a845-714057922b0e\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.756494 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-catalog-content\") pod \"ae936d72-5b36-46cb-a845-714057922b0e\" (UID: \"ae936d72-5b36-46cb-a845-714057922b0e\") " Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.757722 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-utilities" (OuterVolumeSpecName: "utilities") pod "ae936d72-5b36-46cb-a845-714057922b0e" (UID: "ae936d72-5b36-46cb-a845-714057922b0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.761326 4834 scope.go:117] "RemoveContainer" containerID="35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.763562 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae936d72-5b36-46cb-a845-714057922b0e-kube-api-access-bqwbq" (OuterVolumeSpecName: "kube-api-access-bqwbq") pod "ae936d72-5b36-46cb-a845-714057922b0e" (UID: "ae936d72-5b36-46cb-a845-714057922b0e"). InnerVolumeSpecName "kube-api-access-bqwbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.790295 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae936d72-5b36-46cb-a845-714057922b0e" (UID: "ae936d72-5b36-46cb-a845-714057922b0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.791646 4834 scope.go:117] "RemoveContainer" containerID="91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.810192 4834 scope.go:117] "RemoveContainer" containerID="0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf" Jan 30 21:19:32 crc kubenswrapper[4834]: E0130 21:19:32.810790 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf\": container with ID starting with 0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf not found: ID does not exist" containerID="0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.810963 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf"} err="failed to get container status \"0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf\": rpc error: code = NotFound desc = could not find container \"0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf\": container with ID starting with 0e9e61e60b548a3bb6719936b1d4b71da23d20c6952d569bb64d31359b3eb5bf not found: ID does not exist" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.811083 4834 scope.go:117] "RemoveContainer" containerID="35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5" Jan 30 21:19:32 crc kubenswrapper[4834]: E0130 21:19:32.811418 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5\": container with ID starting with 
35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5 not found: ID does not exist" containerID="35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.811453 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5"} err="failed to get container status \"35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5\": rpc error: code = NotFound desc = could not find container \"35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5\": container with ID starting with 35885b439bc35c0aa747c56815b1ada36f6d8bb0fbac058da9484e7b63d8d8d5 not found: ID does not exist" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.811472 4834 scope.go:117] "RemoveContainer" containerID="91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e" Jan 30 21:19:32 crc kubenswrapper[4834]: E0130 21:19:32.811691 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e\": container with ID starting with 91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e not found: ID does not exist" containerID="91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.811729 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e"} err="failed to get container status \"91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e\": rpc error: code = NotFound desc = could not find container \"91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e\": container with ID starting with 91fbdc408e8cb7a063426d279eb127cdff0dd21066dcd7957c2f695af9c7a94e not found: ID does not 
exist" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.858576 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.858624 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae936d72-5b36-46cb-a845-714057922b0e-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:32 crc kubenswrapper[4834]: I0130 21:19:32.858644 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqwbq\" (UniqueName: \"kubernetes.io/projected/ae936d72-5b36-46cb-a845-714057922b0e-kube-api-access-bqwbq\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:33 crc kubenswrapper[4834]: I0130 21:19:33.087004 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwzrw"] Jan 30 21:19:33 crc kubenswrapper[4834]: I0130 21:19:33.093934 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwzrw"] Jan 30 21:19:33 crc kubenswrapper[4834]: I0130 21:19:33.544501 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae936d72-5b36-46cb-a845-714057922b0e" path="/var/lib/kubelet/pods/ae936d72-5b36-46cb-a845-714057922b0e/volumes" Jan 30 21:19:33 crc kubenswrapper[4834]: I0130 21:19:33.743268 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hgnmx" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="registry-server" containerID="cri-o://1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f" gracePeriod=2 Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.161626 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.161690 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.161744 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.162445 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.162511 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243" gracePeriod=600 Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.660540 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.685576 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-catalog-content\") pod \"54d43e08-9a17-464b-8808-5bce5d3502d0\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.685801 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lpbh\" (UniqueName: \"kubernetes.io/projected/54d43e08-9a17-464b-8808-5bce5d3502d0-kube-api-access-8lpbh\") pod \"54d43e08-9a17-464b-8808-5bce5d3502d0\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.685936 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-utilities\") pod \"54d43e08-9a17-464b-8808-5bce5d3502d0\" (UID: \"54d43e08-9a17-464b-8808-5bce5d3502d0\") " Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.686945 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-utilities" (OuterVolumeSpecName: "utilities") pod "54d43e08-9a17-464b-8808-5bce5d3502d0" (UID: "54d43e08-9a17-464b-8808-5bce5d3502d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.694651 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d43e08-9a17-464b-8808-5bce5d3502d0-kube-api-access-8lpbh" (OuterVolumeSpecName: "kube-api-access-8lpbh") pod "54d43e08-9a17-464b-8808-5bce5d3502d0" (UID: "54d43e08-9a17-464b-8808-5bce5d3502d0"). InnerVolumeSpecName "kube-api-access-8lpbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.749888 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243" exitCode=0 Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.750284 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243"} Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.750317 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"42c330620a3e86a82c2bb84c857d3ae702f97694fd939fb37ee985cfe42ce65b"} Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.755694 4834 generic.go:334] "Generic (PLEG): container finished" podID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerID="1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f" exitCode=0 Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.755739 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgnmx" event={"ID":"54d43e08-9a17-464b-8808-5bce5d3502d0","Type":"ContainerDied","Data":"1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f"} Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.755771 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hgnmx" event={"ID":"54d43e08-9a17-464b-8808-5bce5d3502d0","Type":"ContainerDied","Data":"926a7882bf58136c3cf604f80ead2818bf8a5495e694665dee16ae83a0abca51"} Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.755791 4834 scope.go:117] "RemoveContainer" 
containerID="1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.755897 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hgnmx" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.785375 4834 scope.go:117] "RemoveContainer" containerID="8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.787542 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lpbh\" (UniqueName: \"kubernetes.io/projected/54d43e08-9a17-464b-8808-5bce5d3502d0-kube-api-access-8lpbh\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.787574 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.807990 4834 scope.go:117] "RemoveContainer" containerID="953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.825337 4834 scope.go:117] "RemoveContainer" containerID="1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f" Jan 30 21:19:34 crc kubenswrapper[4834]: E0130 21:19:34.826056 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f\": container with ID starting with 1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f not found: ID does not exist" containerID="1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.826109 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f"} err="failed to get container status \"1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f\": rpc error: code = NotFound desc = could not find container \"1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f\": container with ID starting with 1a3a03fde6551177de2b14628d29c49f9c5db239903bda91f1f83ac4105d6d4f not found: ID does not exist" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.826155 4834 scope.go:117] "RemoveContainer" containerID="8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750" Jan 30 21:19:34 crc kubenswrapper[4834]: E0130 21:19:34.826598 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750\": container with ID starting with 8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750 not found: ID does not exist" containerID="8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.826665 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750"} err="failed to get container status \"8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750\": rpc error: code = NotFound desc = could not find container \"8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750\": container with ID starting with 8b8f77b68a11642c75d9b3ae7aa0c6441dcd7b1005b52f1c9ca463eeab056750 not found: ID does not exist" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.826691 4834 scope.go:117] "RemoveContainer" containerID="953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450" Jan 30 21:19:34 crc kubenswrapper[4834]: E0130 21:19:34.827566 4834 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450\": container with ID starting with 953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450 not found: ID does not exist" containerID="953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.827971 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450"} err="failed to get container status \"953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450\": rpc error: code = NotFound desc = could not find container \"953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450\": container with ID starting with 953239aa37927816c85b8acb7f3353ae4fa6476ede3d0e1e0d4c1a3becac7450 not found: ID does not exist" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.846068 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54d43e08-9a17-464b-8808-5bce5d3502d0" (UID: "54d43e08-9a17-464b-8808-5bce5d3502d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:19:34 crc kubenswrapper[4834]: I0130 21:19:34.889182 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d43e08-9a17-464b-8808-5bce5d3502d0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:35 crc kubenswrapper[4834]: I0130 21:19:35.100233 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hgnmx"] Jan 30 21:19:35 crc kubenswrapper[4834]: I0130 21:19:35.105573 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hgnmx"] Jan 30 21:19:35 crc kubenswrapper[4834]: I0130 21:19:35.544791 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" path="/var/lib/kubelet/pods/54d43e08-9a17-464b-8808-5bce5d3502d0/volumes" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.934054 4834 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935009 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae936d72-5b36-46cb-a845-714057922b0e" containerName="extract-utilities" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935025 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae936d72-5b36-46cb-a845-714057922b0e" containerName="extract-utilities" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935041 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="extract-content" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935049 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="extract-content" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935064 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="extract-utilities" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935073 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="extract-utilities" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935089 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerName="extract-content" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935098 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerName="extract-content" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935111 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae936d72-5b36-46cb-a845-714057922b0e" containerName="extract-content" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935121 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae936d72-5b36-46cb-a845-714057922b0e" containerName="extract-content" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935132 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="extract-utilities" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935142 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="extract-utilities" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935154 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae936d72-5b36-46cb-a845-714057922b0e" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935163 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae936d72-5b36-46cb-a845-714057922b0e" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935177 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerName="extract-utilities" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935188 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerName="extract-utilities" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935200 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="extract-content" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935211 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="extract-content" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935221 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935228 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935241 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935248 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.935262 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935270 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935447 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cd80c77f-7d28-4be9-ac43-d8b7e09872e3" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935461 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d43e08-9a17-464b-8808-5bce5d3502d0" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935477 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="768fba83-c2e4-401a-81c8-ad4ecec9dac7" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935489 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae936d72-5b36-46cb-a845-714057922b0e" containerName="registry-server" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.935865 4834 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.936032 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.936166 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff" gracePeriod=15 Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.936250 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2" gracePeriod=15 Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.936315 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e" gracePeriod=15 Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.936287 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145" gracePeriod=15 Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.936299 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70" gracePeriod=15 Jan 30 21:19:45 crc 
kubenswrapper[4834]: I0130 21:19:45.938597 4834 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.938829 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.938842 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.938851 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.938859 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.938893 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.938902 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.938915 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.938922 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.938936 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.938943 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.938969 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.938978 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.939106 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.939122 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.939138 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.939158 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.939169 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.939178 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:19:45 crc kubenswrapper[4834]: E0130 21:19:45.939290 4834 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:19:45 crc kubenswrapper[4834]: I0130 21:19:45.939300 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.056213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.056306 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.056376 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.056436 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: 
I0130 21:19:46.056464 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.056507 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.056550 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.056595 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.157314 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc 
kubenswrapper[4834]: I0130 21:19:46.157535 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.157789 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.157858 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.157915 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.157947 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc 
kubenswrapper[4834]: I0130 21:19:46.157951 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158025 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158077 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158027 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158132 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158172 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158232 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158314 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158519 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.158559 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.836306 4834 generic.go:334] "Generic (PLEG): container finished" podID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" containerID="b7e7cec5cc85c855cf47812d31428bbc14a8029c3603805f5f42fcfa46620cf5" exitCode=0 Jan 30 
21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.836893 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b","Type":"ContainerDied","Data":"b7e7cec5cc85c855cf47812d31428bbc14a8029c3603805f5f42fcfa46620cf5"} Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.839227 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.839792 4834 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.839925 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.842130 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.843181 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145" exitCode=0 Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.843217 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2" exitCode=0 Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.843232 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70" exitCode=0 Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.843250 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e" exitCode=2 Jan 30 21:19:46 crc kubenswrapper[4834]: I0130 21:19:46.843310 4834 scope.go:117] "RemoveContainer" containerID="170f67d12d3c071ae81a7bbdd12fbe4696c17d303329d3dcaffb1bea22406a39" Jan 30 21:19:47 crc kubenswrapper[4834]: I0130 21:19:47.290516 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 30 21:19:47 crc kubenswrapper[4834]: I0130 21:19:47.290599 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 30 21:19:47 crc kubenswrapper[4834]: I0130 21:19:47.855513 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.339500 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.340446 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.347550 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.348305 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.348678 4834 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.349064 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.389915 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 
21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390001 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-var-lock\") pod \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390051 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kubelet-dir\") pod \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390091 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390111 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kube-api-access\") pod \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\" (UID: \"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b\") " Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390218 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" (UID: "9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390268 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390220 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-var-lock" (OuterVolumeSpecName: "var-lock") pod "9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" (UID: "9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390310 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390345 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390382 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390771 4834 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390806 4834 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390831 4834 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390855 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.390881 4834 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.399704 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" (UID: "9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.491869 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.867515 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.867497 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b","Type":"ContainerDied","Data":"bda3294a0422731835d643521e55d982afc868663145da3d27a94dc32bd10168"} Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.868129 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bda3294a0422731835d643521e55d982afc868663145da3d27a94dc32bd10168" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.872531 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.873751 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff" exitCode=0 Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.873857 4834 scope.go:117] "RemoveContainer" containerID="01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.873932 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.893201 4834 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.893791 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.897176 4834 scope.go:117] "RemoveContainer" containerID="07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.898284 4834 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.898759 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.916346 4834 scope.go:117] "RemoveContainer" containerID="1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70" Jan 30 21:19:48 crc 
kubenswrapper[4834]: I0130 21:19:48.949058 4834 scope.go:117] "RemoveContainer" containerID="2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e" Jan 30 21:19:48 crc kubenswrapper[4834]: I0130 21:19:48.971242 4834 scope.go:117] "RemoveContainer" containerID="f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.000422 4834 scope.go:117] "RemoveContainer" containerID="2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.025511 4834 scope.go:117] "RemoveContainer" containerID="01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.026006 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\": container with ID starting with 01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145 not found: ID does not exist" containerID="01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.026052 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145"} err="failed to get container status \"01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\": rpc error: code = NotFound desc = could not find container \"01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145\": container with ID starting with 01319ba9097cd954b5d85d4c1477ba34bc7b5df0e722f195d3f483755dea5145 not found: ID does not exist" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.026077 4834 scope.go:117] "RemoveContainer" containerID="07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.026466 
4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\": container with ID starting with 07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2 not found: ID does not exist" containerID="07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.026508 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2"} err="failed to get container status \"07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\": rpc error: code = NotFound desc = could not find container \"07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2\": container with ID starting with 07cfccb4cb04d18c3f58adbcba75e40c657e7d28be6f8729c193635f764f1ac2 not found: ID does not exist" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.026530 4834 scope.go:117] "RemoveContainer" containerID="1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.026747 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\": container with ID starting with 1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70 not found: ID does not exist" containerID="1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.026768 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70"} err="failed to get container status \"1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\": rpc error: code = 
NotFound desc = could not find container \"1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70\": container with ID starting with 1bfb57980b2b89b55b76a8b91269a5e4cbb0d75140e7750f797e70a18137cf70 not found: ID does not exist" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.026783 4834 scope.go:117] "RemoveContainer" containerID="2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.027019 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\": container with ID starting with 2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e not found: ID does not exist" containerID="2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.027039 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e"} err="failed to get container status \"2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\": rpc error: code = NotFound desc = could not find container \"2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e\": container with ID starting with 2b9257135ad544cf887c161ab3c123c98ffbb86f4a5e699063e6f272c826095e not found: ID does not exist" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.027056 4834 scope.go:117] "RemoveContainer" containerID="f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.027270 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\": container with ID starting with 
f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff not found: ID does not exist" containerID="f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.027292 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff"} err="failed to get container status \"f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\": rpc error: code = NotFound desc = could not find container \"f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff\": container with ID starting with f58ff277814169c9ad260eae3f0ffb12cfd77466855c79e8dbee1d7651ccefff not found: ID does not exist" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.027309 4834 scope.go:117] "RemoveContainer" containerID="2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.027601 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\": container with ID starting with 2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365 not found: ID does not exist" containerID="2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.027632 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365"} err="failed to get container status \"2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\": rpc error: code = NotFound desc = could not find container \"2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365\": container with ID starting with 2a8177da4239316d9bcd810effb6099a3c05b852ada54f96232674bb22a5e365 not found: ID does not 
exist" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.208652 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.209140 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.209461 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.209781 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.210181 4834 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.210226 4834 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.210536 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" 
interval="200ms" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.552854 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="400ms" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.567503 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.567862 4834 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:49 crc kubenswrapper[4834]: I0130 21:19:49.577047 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 30 21:19:49 crc kubenswrapper[4834]: E0130 21:19:49.954555 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="800ms" Jan 30 21:19:50 crc kubenswrapper[4834]: E0130 21:19:50.756036 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection 
refused" interval="1.6s" Jan 30 21:19:50 crc kubenswrapper[4834]: E0130 21:19:50.985108 4834 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:50 crc kubenswrapper[4834]: I0130 21:19:50.986133 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:51 crc kubenswrapper[4834]: W0130 21:19:51.017288 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-186455978e39625c3b72796d2b73c8182a0e30d0f7f2d0fc33b1761348c3136c WatchSource:0}: Error finding container 186455978e39625c3b72796d2b73c8182a0e30d0f7f2d0fc33b1761348c3136c: Status 404 returned error can't find the container with id 186455978e39625c3b72796d2b73c8182a0e30d0f7f2d0fc33b1761348c3136c Jan 30 21:19:51 crc kubenswrapper[4834]: E0130 21:19:51.021665 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f9efdb1730a01 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 21:19:51.020915201 +0000 UTC m=+242.174061379,LastTimestamp:2026-01-30 21:19:51.020915201 +0000 UTC m=+242.174061379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 21:19:51 crc kubenswrapper[4834]: I0130 21:19:51.901001 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4a934c4d44b1ea41441e2a06a4ac97385b4a62531776980fcd84870b04d9462b"} Jan 30 21:19:51 crc kubenswrapper[4834]: I0130 21:19:51.901591 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"186455978e39625c3b72796d2b73c8182a0e30d0f7f2d0fc33b1761348c3136c"} Jan 30 21:19:51 crc kubenswrapper[4834]: E0130 21:19:51.902366 4834 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:19:51 crc kubenswrapper[4834]: I0130 21:19:51.902668 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:52 crc kubenswrapper[4834]: E0130 21:19:52.357748 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.106:6443: connect: connection refused" interval="3.2s" Jan 30 21:19:55 crc kubenswrapper[4834]: E0130 21:19:55.128891 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.106:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188f9efdb1730a01 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-30 21:19:51.020915201 +0000 UTC m=+242.174061379,LastTimestamp:2026-01-30 21:19:51.020915201 +0000 UTC m=+242.174061379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.403069 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" containerName="oauth-openshift" containerID="cri-o://596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3" gracePeriod=15 Jan 30 21:19:55 crc kubenswrapper[4834]: E0130 21:19:55.558940 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.106:6443: connect: connection refused" interval="6.4s" Jan 30 
21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.887446 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.888098 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.888779 4834 status_manager.go:851] "Failed to get status for pod" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xvvtx\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.925777 4834 generic.go:334] "Generic (PLEG): container finished" podID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" containerID="596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3" exitCode=0 Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.925831 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" event={"ID":"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68","Type":"ContainerDied","Data":"596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3"} Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.925863 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" event={"ID":"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68","Type":"ContainerDied","Data":"977ed8e7fcff9be07983124dcb4830f2e76bf2ea12a4603f9bebdd0980cba2a2"} Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.925882 4834 scope.go:117] 
"RemoveContainer" containerID="596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.925883 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.926593 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.927171 4834 status_manager.go:851] "Failed to get status for pod" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xvvtx\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.948674 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-provider-selection\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.948934 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-login\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.949035 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tllls\" (UniqueName: \"kubernetes.io/projected/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-kube-api-access-tllls\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.949127 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-dir\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.949216 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-idp-0-file-data\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.949345 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-router-certs\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.949454 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-serving-cert\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.949571 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.949956 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-ocp-branding-template\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.950073 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-trusted-ca-bundle\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.950171 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.950271 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-error\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.950366 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-policies\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.950495 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-cliconfig\") pod \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\" (UID: \"719a9eb0-8eb3-4fe7-888a-a1e9a426ed68\") " Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.949231 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.951330 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.951342 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.951995 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.952637 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.955987 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.956280 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-kube-api-access-tllls" (OuterVolumeSpecName: "kube-api-access-tllls") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "kube-api-access-tllls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.958997 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.959252 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.964717 4834 scope.go:117] "RemoveContainer" containerID="596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3" Jan 30 21:19:55 crc kubenswrapper[4834]: E0130 21:19:55.965300 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3\": container with ID starting with 596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3 not found: ID does not exist" containerID="596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.965371 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3"} err="failed to get container status \"596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3\": rpc 
error: code = NotFound desc = could not find container \"596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3\": container with ID starting with 596fc905ed77a36fd9b3024e21a7369c366b2f216227e906bdb27cc4f602a2c3 not found: ID does not exist" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.966575 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.967048 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.967262 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.967498 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:55 crc kubenswrapper[4834]: I0130 21:19:55.967746 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" (UID: "719a9eb0-8eb3-4fe7-888a-a1e9a426ed68"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.052281 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.052682 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.052820 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tllls\" (UniqueName: \"kubernetes.io/projected/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-kube-api-access-tllls\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc 
kubenswrapper[4834]: I0130 21:19:56.052956 4834 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.053075 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.053192 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.053308 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.053470 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.053595 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.053729 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.053847 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.053970 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.054094 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.054218 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.254486 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:56 crc kubenswrapper[4834]: I0130 21:19:56.255026 4834 status_manager.go:851] "Failed to get status for pod" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xvvtx\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:58 crc kubenswrapper[4834]: I0130 21:19:58.690269 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Jan 30 21:19:58 crc kubenswrapper[4834]: I0130 21:19:58.690659 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Jan 30 21:19:58 crc kubenswrapper[4834]: I0130 21:19:58.952625 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 21:19:58 crc kubenswrapper[4834]: I0130 21:19:58.952711 4834 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1" exitCode=1 Jan 30 21:19:58 crc kubenswrapper[4834]: I0130 21:19:58.952755 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1"} Jan 30 21:19:58 crc kubenswrapper[4834]: I0130 21:19:58.953378 4834 scope.go:117] "RemoveContainer" containerID="172c497ae94c8003e273d48ffa010522acd443d2c5d6c6598473d88bacc830f1" Jan 30 21:19:58 crc 
kubenswrapper[4834]: I0130 21:19:58.953980 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:58 crc kubenswrapper[4834]: I0130 21:19:58.954644 4834 status_manager.go:851] "Failed to get status for pod" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xvvtx\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:58 crc kubenswrapper[4834]: I0130 21:19:58.955435 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:59 crc kubenswrapper[4834]: I0130 21:19:59.535851 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:59 crc kubenswrapper[4834]: I0130 21:19:59.538613 4834 status_manager.go:851] "Failed to get status for pod" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xvvtx\": dial tcp 
38.102.83.106:6443: connect: connection refused" Jan 30 21:19:59 crc kubenswrapper[4834]: I0130 21:19:59.539155 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:59 crc kubenswrapper[4834]: I0130 21:19:59.968052 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 30 21:19:59 crc kubenswrapper[4834]: I0130 21:19:59.968147 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7c63d3384654b94c358db2cd2ffb6e3b3dc6659f8e27ef577a2269791f2470ab"} Jan 30 21:19:59 crc kubenswrapper[4834]: I0130 21:19:59.969247 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:59 crc kubenswrapper[4834]: I0130 21:19:59.969715 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:19:59 crc kubenswrapper[4834]: I0130 21:19:59.970302 4834 status_manager.go:851] "Failed to get status for pod" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" 
pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xvvtx\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.531805 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.533511 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.534633 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.535304 4834 status_manager.go:851] "Failed to get status for pod" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xvvtx\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.557971 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.558025 4834 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:00 crc kubenswrapper[4834]: E0130 21:20:00.558641 4834 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.559291 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.680164 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.978535 4834 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="29024bcd552faa64c7d4d006017f7d32bb3e861f0489cf6b5d2efe8e4642edf3" exitCode=0 Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.978671 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"29024bcd552faa64c7d4d006017f7d32bb3e861f0489cf6b5d2efe8e4642edf3"} Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.978748 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c1a11026d1e2bbbd9888f1003649388be0e85a6a2f5b8106b838739f63e0d734"} Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.979443 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.979494 4834 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.979822 4834 status_manager.go:851] "Failed to get status for pod" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:20:00 crc kubenswrapper[4834]: E0130 21:20:00.979994 4834 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.980646 4834 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:20:00 crc kubenswrapper[4834]: I0130 21:20:00.981109 4834 status_manager.go:851] "Failed to get status for pod" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" pod="openshift-authentication/oauth-openshift-558db77b4-xvvtx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-xvvtx\": dial tcp 38.102.83.106:6443: connect: connection refused" Jan 30 21:20:01 crc kubenswrapper[4834]: I0130 21:20:01.460132 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:20:01 crc kubenswrapper[4834]: I0130 21:20:01.468234 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:20:02 crc kubenswrapper[4834]: I0130 21:20:02.002568 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"33ef7fed47c35bfd16a990357c5bfd520d20b70c4fb843ab7498d1b9aed8d892"} Jan 30 21:20:02 crc kubenswrapper[4834]: I0130 21:20:02.002667 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f14c1cc49d89bf84c2aa8addbe91e640f8193f8ac8cf40a26b41970414deb0f8"} Jan 30 21:20:02 crc kubenswrapper[4834]: I0130 21:20:02.002687 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4cc400a4d1f007796772dfd72b3094a2c707f750d28a54b6297a21e3d6976661"} Jan 30 21:20:03 crc kubenswrapper[4834]: I0130 21:20:03.010140 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cf5a1132c0357385c9cf279fdfa9cb22655b3d416f03823dcce04aaf568c8472"} Jan 30 21:20:03 crc kubenswrapper[4834]: I0130 21:20:03.010407 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5fccb42fcc3c6fa72125438c94bfb8f2854014bbefa2fab0a517cc4725d8adcf"} Jan 30 21:20:03 crc kubenswrapper[4834]: I0130 21:20:03.010553 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:03 crc kubenswrapper[4834]: I0130 21:20:03.010566 4834 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:05 crc kubenswrapper[4834]: I0130 21:20:05.560148 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:05 crc kubenswrapper[4834]: I0130 21:20:05.560803 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:05 crc kubenswrapper[4834]: I0130 21:20:05.568999 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:08 crc kubenswrapper[4834]: I0130 21:20:08.024899 4834 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:09 crc kubenswrapper[4834]: I0130 21:20:09.041424 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:09 crc kubenswrapper[4834]: I0130 21:20:09.041474 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:09 crc kubenswrapper[4834]: I0130 21:20:09.041911 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:09 crc kubenswrapper[4834]: I0130 21:20:09.047219 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:09 crc kubenswrapper[4834]: I0130 21:20:09.559031 4834 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f9214c4b-d455-43d2-af98-96a5f4be2f6c" Jan 30 21:20:10 crc kubenswrapper[4834]: 
I0130 21:20:10.047641 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:10 crc kubenswrapper[4834]: I0130 21:20:10.047693 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:10 crc kubenswrapper[4834]: I0130 21:20:10.051593 4834 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f9214c4b-d455-43d2-af98-96a5f4be2f6c" Jan 30 21:20:10 crc kubenswrapper[4834]: I0130 21:20:10.688710 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 30 21:20:11 crc kubenswrapper[4834]: I0130 21:20:11.053973 4834 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:11 crc kubenswrapper[4834]: I0130 21:20:11.054017 4834 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ccb7af7f-dc56-4806-be9c-cce94d47c10e" Jan 30 21:20:11 crc kubenswrapper[4834]: I0130 21:20:11.057851 4834 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f9214c4b-d455-43d2-af98-96a5f4be2f6c" Jan 30 21:20:18 crc kubenswrapper[4834]: I0130 21:20:18.033461 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 30 21:20:18 crc kubenswrapper[4834]: I0130 21:20:18.512808 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 30 21:20:18 crc kubenswrapper[4834]: I0130 
21:20:18.903209 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 30 21:20:18 crc kubenswrapper[4834]: I0130 21:20:18.979324 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 30 21:20:18 crc kubenswrapper[4834]: I0130 21:20:18.990521 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 30 21:20:19 crc kubenswrapper[4834]: I0130 21:20:19.176844 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 30 21:20:19 crc kubenswrapper[4834]: I0130 21:20:19.328772 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 30 21:20:19 crc kubenswrapper[4834]: I0130 21:20:19.571158 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 30 21:20:19 crc kubenswrapper[4834]: I0130 21:20:19.928463 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.124886 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.125505 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.187069 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.330072 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.436175 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.761751 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.937223 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.959007 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 30 21:20:20 crc kubenswrapper[4834]: I0130 21:20:20.997886 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.091965 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.115824 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.139334 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.208486 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.229264 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.339885 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.382235 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.422883 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.451995 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.483459 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.541381 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.641181 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.641339 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.666492 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.907293 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 30 21:20:21 crc kubenswrapper[4834]: I0130 21:20:21.961460 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.014745 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.033687 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.142165 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.151108 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.169759 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.244159 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.274013 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.397823 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.533916 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.561716 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.637290 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.653308 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.773836 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.792346 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.905712 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.934653 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 30 21:20:22 crc kubenswrapper[4834]: I0130 21:20:22.962155 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.000367 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.037531 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.040770 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.194969 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.259467 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.261239 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.330605 4834 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.373891 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.390682 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.400984 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.414009 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.421915 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.422882 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.448315 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.451060 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.491239 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.678750 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.724911 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.744626 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.757568 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.894377 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.931104 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.946844 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.966229 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 30 21:20:23 crc kubenswrapper[4834]: I0130 21:20:23.993059 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.094576 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.158259 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.208643 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.224926 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.273640 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.276369 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.296765 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.306172 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.318009 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.331581 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.345921 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.371830 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.484448 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.510671 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.584137 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.633907 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.713444 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.720690 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.735327 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.750456 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.889157 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 30 21:20:24 crc kubenswrapper[4834]: I0130 21:20:24.938435 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.048969 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.061543 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.130814 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.131602 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.131939 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.151440 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.169293 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.299964 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.351957 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.372330 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.372577 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.406445 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.411012 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.412178 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.473364 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.495283 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.593684 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.726588 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.740318 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.745027 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.932825 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 30 21:20:25 crc kubenswrapper[4834]: I0130 21:20:25.950877 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.010921 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.011296 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.113637 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.258259 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.357420 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.453333 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.454463 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.522214 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.525771 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.590207 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.592017 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.601129 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.607016 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.642787 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.704070 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.787246 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.919195 4834 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 30 21:20:26 crc kubenswrapper[4834]: I0130 21:20:26.928133 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.024815 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.087305 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.105773 4834 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.214377 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.223027 4834 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.308417 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.368900 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.383557 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.437602 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.508702 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.517299 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.528652 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.543331 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.591755 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.649765 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.785637 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.819485 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.856725 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.874988 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.917361 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.937985 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 30 21:20:27 crc kubenswrapper[4834]: I0130 21:20:27.946779 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.009855 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.245053 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.254692 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.360335 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.502941 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.522581 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.553909 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.577913 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.591447 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.671285 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.671313 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.734752 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.799470 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 30 21:20:28 crc kubenswrapper[4834]: I0130 21:20:28.995130 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.069174 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.148638 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.265371 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.373531 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.399973 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.401103 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.514180 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.565252 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.655854 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.659984 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.771859 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.782288 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 30 21:20:29 crc kubenswrapper[4834]: I0130 21:20:29.787610 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 30 21:20:30 crc kubenswrapper[4834]: I0130 21:20:30.002285 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 30 21:20:30 crc kubenswrapper[4834]: I0130 21:20:30.291318 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 30 21:20:30 crc kubenswrapper[4834]: I0130 21:20:30.385951 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 30 21:20:30 crc kubenswrapper[4834]: I0130 21:20:30.453090 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 30 21:20:30 crc kubenswrapper[4834]: I0130 21:20:30.484970 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 30 21:20:30 crc kubenswrapper[4834]: I0130 21:20:30.777699 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 30 21:20:30 crc kubenswrapper[4834]: I0130 21:20:30.793908 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 30 21:20:30 crc kubenswrapper[4834]: I0130 21:20:30.924879 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.091210 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.119675 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.209907 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.218068 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.219834 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.263603 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.335540 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.377938 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.587448 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.661429 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.669573 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.754058 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.769617 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.845039 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.856741 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.862488 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 30 21:20:31 crc kubenswrapper[4834]: I0130 21:20:31.888259 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.106668 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.182296 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.236025 4834 reflector.go:368] Caches populated for *v1.ConfigMap
from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.476477 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.505266 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.524272 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.595235 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.761355 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.849079 4834 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.856847 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xvvtx","openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.856944 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.862328 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.893641 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.893613285 podStartE2EDuration="24.893613285s" 
podCreationTimestamp="2026-01-30 21:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:20:32.887175726 +0000 UTC m=+284.040321924" watchObservedRunningTime="2026-01-30 21:20:32.893613285 +0000 UTC m=+284.046759463" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.910825 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 30 21:20:32 crc kubenswrapper[4834]: I0130 21:20:32.969478 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 30 21:20:33 crc kubenswrapper[4834]: I0130 21:20:33.116771 4834 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 30 21:20:33 crc kubenswrapper[4834]: I0130 21:20:33.359794 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 30 21:20:33 crc kubenswrapper[4834]: I0130 21:20:33.424172 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 30 21:20:33 crc kubenswrapper[4834]: I0130 21:20:33.542368 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" path="/var/lib/kubelet/pods/719a9eb0-8eb3-4fe7-888a-a1e9a426ed68/volumes" Jan 30 21:20:33 crc kubenswrapper[4834]: I0130 21:20:33.570500 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 30 21:20:33 crc kubenswrapper[4834]: I0130 21:20:33.594314 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 30 21:20:33 crc kubenswrapper[4834]: I0130 21:20:33.595258 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Jan 30 21:20:33 crc kubenswrapper[4834]: I0130 21:20:33.696881 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 30 21:20:34 crc kubenswrapper[4834]: I0130 21:20:34.068007 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 30 21:20:34 crc kubenswrapper[4834]: I0130 21:20:34.071628 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 30 21:20:34 crc kubenswrapper[4834]: I0130 21:20:34.160239 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 30 21:20:34 crc kubenswrapper[4834]: I0130 21:20:34.187894 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 30 21:20:34 crc kubenswrapper[4834]: I0130 21:20:34.240429 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 30 21:20:34 crc kubenswrapper[4834]: I0130 21:20:34.996198 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 30 21:20:36 crc kubenswrapper[4834]: I0130 21:20:36.017927 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 30 21:20:41 crc kubenswrapper[4834]: I0130 21:20:41.949013 4834 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 30 21:20:41 crc kubenswrapper[4834]: I0130 21:20:41.949664 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" containerID="cri-o://4a934c4d44b1ea41441e2a06a4ac97385b4a62531776980fcd84870b04d9462b" gracePeriod=5 Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.949434 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx"] Jan 30 21:20:42 crc kubenswrapper[4834]: E0130 21:20:42.949977 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.949994 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 21:20:42 crc kubenswrapper[4834]: E0130 21:20:42.950016 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" containerName="installer" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.950028 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" containerName="installer" Jan 30 21:20:42 crc kubenswrapper[4834]: E0130 21:20:42.950048 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" containerName="oauth-openshift" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.950057 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" containerName="oauth-openshift" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.950188 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="719a9eb0-8eb3-4fe7-888a-a1e9a426ed68" containerName="oauth-openshift" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.950202 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.950215 4834 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9d25b97c-1b18-4bc6-aa1b-9924b45f6a4b" containerName="installer" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.950829 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.953448 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.957169 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.957895 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.958196 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.960376 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.962219 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.962654 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.962694 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.962969 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 30 
21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.963251 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.963666 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.963968 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.964383 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx"] Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.970178 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.970884 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 30 21:20:42 crc kubenswrapper[4834]: I0130 21:20:42.995967 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065061 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065120 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065336 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67740e54-ace2-4b67-9805-d6a25b2072dd-audit-dir\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065370 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065420 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065438 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-error\") pod 
\"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065650 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065678 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065706 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065757 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " 
pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065793 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065824 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqd6\" (UniqueName: \"kubernetes.io/projected/67740e54-ace2-4b67-9805-d6a25b2072dd-kube-api-access-rgqd6\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065845 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-audit-policies\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.065868 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-session\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.166876 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.166947 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.166979 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqd6\" (UniqueName: \"kubernetes.io/projected/67740e54-ace2-4b67-9805-d6a25b2072dd-kube-api-access-rgqd6\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167008 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-audit-policies\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167046 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-session\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " 
pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167076 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167118 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167155 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67740e54-ace2-4b67-9805-d6a25b2072dd-audit-dir\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167179 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167206 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167229 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167254 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167277 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.167302 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: 
\"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.168150 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.168225 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67740e54-ace2-4b67-9805-d6a25b2072dd-audit-dir\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.168974 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.169746 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-audit-policies\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.169855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.178821 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.178973 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.179123 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-session\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.180791 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " 
pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.181601 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.189575 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.196544 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.201137 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/67740e54-ace2-4b67-9805-d6a25b2072dd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.202151 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqd6\" (UniqueName: 
\"kubernetes.io/projected/67740e54-ace2-4b67-9805-d6a25b2072dd-kube-api-access-rgqd6\") pod \"oauth-openshift-5d4b6f47b4-cccxx\" (UID: \"67740e54-ace2-4b67-9805-d6a25b2072dd\") " pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.337780 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:43 crc kubenswrapper[4834]: I0130 21:20:43.626522 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx"] Jan 30 21:20:43 crc kubenswrapper[4834]: W0130 21:20:43.635018 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67740e54_ace2_4b67_9805_d6a25b2072dd.slice/crio-7db8243e503906fa1af9b0da0f8b402cbc0b41b2767c095e80f14cc23d77ed2e WatchSource:0}: Error finding container 7db8243e503906fa1af9b0da0f8b402cbc0b41b2767c095e80f14cc23d77ed2e: Status 404 returned error can't find the container with id 7db8243e503906fa1af9b0da0f8b402cbc0b41b2767c095e80f14cc23d77ed2e Jan 30 21:20:44 crc kubenswrapper[4834]: I0130 21:20:44.270975 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" event={"ID":"67740e54-ace2-4b67-9805-d6a25b2072dd","Type":"ContainerStarted","Data":"e080c09c0f61cd7f2d064f92c0ac629c314918ce37707c98bf1ececed67e9c2e"} Jan 30 21:20:44 crc kubenswrapper[4834]: I0130 21:20:44.275465 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" event={"ID":"67740e54-ace2-4b67-9805-d6a25b2072dd","Type":"ContainerStarted","Data":"7db8243e503906fa1af9b0da0f8b402cbc0b41b2767c095e80f14cc23d77ed2e"} Jan 30 21:20:44 crc kubenswrapper[4834]: I0130 21:20:44.276002 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:44 crc kubenswrapper[4834]: I0130 21:20:44.411184 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" Jan 30 21:20:44 crc kubenswrapper[4834]: I0130 21:20:44.444072 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d4b6f47b4-cccxx" podStartSLOduration=74.444025068 podStartE2EDuration="1m14.444025068s" podCreationTimestamp="2026-01-30 21:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:20:44.31347286 +0000 UTC m=+295.466619038" watchObservedRunningTime="2026-01-30 21:20:44.444025068 +0000 UTC m=+295.597171246" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.294707 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.294768 4834 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="4a934c4d44b1ea41441e2a06a4ac97385b4a62531776980fcd84870b04d9462b" exitCode=137 Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.547764 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.547867 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732063 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732142 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732216 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732282 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732623 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: 
"var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732685 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732716 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732796 4834 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732819 4834 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.732800 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.745423 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.834551 4834 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.834605 4834 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:47 crc kubenswrapper[4834]: I0130 21:20:47.834631 4834 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:48 crc kubenswrapper[4834]: I0130 21:20:48.305729 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 30 21:20:48 crc kubenswrapper[4834]: I0130 21:20:48.305943 4834 scope.go:117] "RemoveContainer" containerID="4a934c4d44b1ea41441e2a06a4ac97385b4a62531776980fcd84870b04d9462b" Jan 30 21:20:48 crc kubenswrapper[4834]: I0130 21:20:48.305990 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 30 21:20:49 crc kubenswrapper[4834]: I0130 21:20:49.316083 4834 generic.go:334] "Generic (PLEG): container finished" podID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerID="f1e3f597d5bb3d4210ce321e536af817f9bb98d856b761ab9d758e1a17d7fc75" exitCode=0 Jan 30 21:20:49 crc kubenswrapper[4834]: I0130 21:20:49.316149 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" event={"ID":"aab9a42d-c833-46b2-a745-1bb95ada7f68","Type":"ContainerDied","Data":"f1e3f597d5bb3d4210ce321e536af817f9bb98d856b761ab9d758e1a17d7fc75"} Jan 30 21:20:49 crc kubenswrapper[4834]: I0130 21:20:49.316828 4834 scope.go:117] "RemoveContainer" containerID="f1e3f597d5bb3d4210ce321e536af817f9bb98d856b761ab9d758e1a17d7fc75" Jan 30 21:20:49 crc kubenswrapper[4834]: I0130 21:20:49.323593 4834 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 30 21:20:49 crc kubenswrapper[4834]: I0130 21:20:49.540386 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 30 21:20:50 crc kubenswrapper[4834]: I0130 21:20:50.324820 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" event={"ID":"aab9a42d-c833-46b2-a745-1bb95ada7f68","Type":"ContainerStarted","Data":"5912761b1d49fe327b7005cc40b8b47231cb943968aca0878ea672b212cdfaed"} Jan 30 21:20:50 crc kubenswrapper[4834]: I0130 21:20:50.325304 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:20:50 crc kubenswrapper[4834]: I0130 21:20:50.327932 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.435795 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zrxr5"] Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.436348 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" podUID="e4abe8be-aa7d-46ad-a658-259955e42044" containerName="controller-manager" containerID="cri-o://5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7" gracePeriod=30 Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.560644 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"] Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.560820 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" podUID="b085cfec-e382-48c5-a623-679412c5b97e" containerName="route-controller-manager" containerID="cri-o://4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574" gracePeriod=30 Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.791811 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.862815 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-proxy-ca-bundles\") pod \"e4abe8be-aa7d-46ad-a658-259955e42044\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.862884 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4ksx\" (UniqueName: \"kubernetes.io/projected/e4abe8be-aa7d-46ad-a658-259955e42044-kube-api-access-c4ksx\") pod \"e4abe8be-aa7d-46ad-a658-259955e42044\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.862964 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-config\") pod \"e4abe8be-aa7d-46ad-a658-259955e42044\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.863007 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4abe8be-aa7d-46ad-a658-259955e42044-serving-cert\") pod \"e4abe8be-aa7d-46ad-a658-259955e42044\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.863047 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-client-ca\") pod \"e4abe8be-aa7d-46ad-a658-259955e42044\" (UID: \"e4abe8be-aa7d-46ad-a658-259955e42044\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.864234 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4abe8be-aa7d-46ad-a658-259955e42044" (UID: "e4abe8be-aa7d-46ad-a658-259955e42044"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.864252 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e4abe8be-aa7d-46ad-a658-259955e42044" (UID: "e4abe8be-aa7d-46ad-a658-259955e42044"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.864334 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-config" (OuterVolumeSpecName: "config") pod "e4abe8be-aa7d-46ad-a658-259955e42044" (UID: "e4abe8be-aa7d-46ad-a658-259955e42044"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.869411 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4abe8be-aa7d-46ad-a658-259955e42044-kube-api-access-c4ksx" (OuterVolumeSpecName: "kube-api-access-c4ksx") pod "e4abe8be-aa7d-46ad-a658-259955e42044" (UID: "e4abe8be-aa7d-46ad-a658-259955e42044"). InnerVolumeSpecName "kube-api-access-c4ksx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.869577 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4abe8be-aa7d-46ad-a658-259955e42044-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4abe8be-aa7d-46ad-a658-259955e42044" (UID: "e4abe8be-aa7d-46ad-a658-259955e42044"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.895227 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.963471 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca\") pod \"b085cfec-e382-48c5-a623-679412c5b97e\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.963522 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d26c2\" (UniqueName: \"kubernetes.io/projected/b085cfec-e382-48c5-a623-679412c5b97e-kube-api-access-d26c2\") pod \"b085cfec-e382-48c5-a623-679412c5b97e\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.963567 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert\") pod \"b085cfec-e382-48c5-a623-679412c5b97e\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.963590 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config\") pod \"b085cfec-e382-48c5-a623-679412c5b97e\" (UID: \"b085cfec-e382-48c5-a623-679412c5b97e\") " Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.964093 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.964110 4834 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4abe8be-aa7d-46ad-a658-259955e42044-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.964118 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.964126 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4abe8be-aa7d-46ad-a658-259955e42044-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.964136 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4ksx\" (UniqueName: \"kubernetes.io/projected/e4abe8be-aa7d-46ad-a658-259955e42044-kube-api-access-c4ksx\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.964466 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca" (OuterVolumeSpecName: "client-ca") pod "b085cfec-e382-48c5-a623-679412c5b97e" (UID: "b085cfec-e382-48c5-a623-679412c5b97e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.964753 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config" (OuterVolumeSpecName: "config") pod "b085cfec-e382-48c5-a623-679412c5b97e" (UID: "b085cfec-e382-48c5-a623-679412c5b97e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.967094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b085cfec-e382-48c5-a623-679412c5b97e-kube-api-access-d26c2" (OuterVolumeSpecName: "kube-api-access-d26c2") pod "b085cfec-e382-48c5-a623-679412c5b97e" (UID: "b085cfec-e382-48c5-a623-679412c5b97e"). InnerVolumeSpecName "kube-api-access-d26c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:20:54 crc kubenswrapper[4834]: I0130 21:20:54.967227 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b085cfec-e382-48c5-a623-679412c5b97e" (UID: "b085cfec-e382-48c5-a623-679412c5b97e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.064934 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.064977 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d26c2\" (UniqueName: \"kubernetes.io/projected/b085cfec-e382-48c5-a623-679412c5b97e-kube-api-access-d26c2\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.064988 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b085cfec-e382-48c5-a623-679412c5b97e-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.064996 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b085cfec-e382-48c5-a623-679412c5b97e-config\") on node \"crc\" DevicePath 
\"\"" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.365220 4834 generic.go:334] "Generic (PLEG): container finished" podID="b085cfec-e382-48c5-a623-679412c5b97e" containerID="4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574" exitCode=0 Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.365353 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.365353 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" event={"ID":"b085cfec-e382-48c5-a623-679412c5b97e","Type":"ContainerDied","Data":"4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574"} Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.365487 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg" event={"ID":"b085cfec-e382-48c5-a623-679412c5b97e","Type":"ContainerDied","Data":"aa5b7a21d1529f620ff06b51a6fbbdbefb35a469a6adb3f77d0e67f17dd01aac"} Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.365535 4834 scope.go:117] "RemoveContainer" containerID="4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.367534 4834 generic.go:334] "Generic (PLEG): container finished" podID="e4abe8be-aa7d-46ad-a658-259955e42044" containerID="5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7" exitCode=0 Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.367562 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.367574 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" event={"ID":"e4abe8be-aa7d-46ad-a658-259955e42044","Type":"ContainerDied","Data":"5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7"} Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.367608 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zrxr5" event={"ID":"e4abe8be-aa7d-46ad-a658-259955e42044","Type":"ContainerDied","Data":"a8d089fe33941de8b6ef439c9b223a36b10a46e4c4b10dcffe4b07f584642ab8"} Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.395071 4834 scope.go:117] "RemoveContainer" containerID="4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574" Jan 30 21:20:55 crc kubenswrapper[4834]: E0130 21:20:55.395899 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574\": container with ID starting with 4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574 not found: ID does not exist" containerID="4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.396918 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574"} err="failed to get container status \"4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574\": rpc error: code = NotFound desc = could not find container \"4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574\": container with ID starting with 4ff6a6302929fadd07f0eefba8997af2c2faa629906a0a98ebdc6e9cb869b574 not found: ID does not 
exist" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.397031 4834 scope.go:117] "RemoveContainer" containerID="5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.416557 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"] Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.421021 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gndbg"] Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.425833 4834 scope.go:117] "RemoveContainer" containerID="5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7" Jan 30 21:20:55 crc kubenswrapper[4834]: E0130 21:20:55.426538 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7\": container with ID starting with 5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7 not found: ID does not exist" containerID="5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.426584 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7"} err="failed to get container status \"5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7\": rpc error: code = NotFound desc = could not find container \"5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7\": container with ID starting with 5a92a26b0cd1a215e46ede949f313126bb9c27189d14a24838d911e3f21893c7 not found: ID does not exist" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.428456 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-zrxr5"] Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.431496 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zrxr5"] Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.539724 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b085cfec-e382-48c5-a623-679412c5b97e" path="/var/lib/kubelet/pods/b085cfec-e382-48c5-a623-679412c5b97e/volumes" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.540625 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4abe8be-aa7d-46ad-a658-259955e42044" path="/var/lib/kubelet/pods/e4abe8be-aa7d-46ad-a658-259955e42044/volumes" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.952636 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-696b475cdf-lnfmg"] Jan 30 21:20:55 crc kubenswrapper[4834]: E0130 21:20:55.953282 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4abe8be-aa7d-46ad-a658-259955e42044" containerName="controller-manager" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.953304 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4abe8be-aa7d-46ad-a658-259955e42044" containerName="controller-manager" Jan 30 21:20:55 crc kubenswrapper[4834]: E0130 21:20:55.953346 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b085cfec-e382-48c5-a623-679412c5b97e" containerName="route-controller-manager" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.953359 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b085cfec-e382-48c5-a623-679412c5b97e" containerName="route-controller-manager" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.953561 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b085cfec-e382-48c5-a623-679412c5b97e" containerName="route-controller-manager" Jan 30 21:20:55 crc 
kubenswrapper[4834]: I0130 21:20:55.953602 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4abe8be-aa7d-46ad-a658-259955e42044" containerName="controller-manager" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.954120 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.955281 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr"] Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.955695 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.959565 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.961975 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.962166 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.962166 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.962363 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.962628 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.962644 4834 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.962842 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.962931 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.962940 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.963008 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.963207 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-696b475cdf-lnfmg"] Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.964597 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.969852 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 21:20:55 crc kubenswrapper[4834]: I0130 21:20:55.975813 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr"] Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075471 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-client-ca\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: 
\"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075531 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-client-ca\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075566 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a4731-ca3b-40f4-a474-817f51770a3e-serving-cert\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075607 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-config\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075631 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64406017-7ac3-43b6-8bd5-96c9a9d37d65-serving-cert\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075673 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-proxy-ca-bundles\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075701 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-config\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075729 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wklr\" (UniqueName: \"kubernetes.io/projected/64406017-7ac3-43b6-8bd5-96c9a9d37d65-kube-api-access-4wklr\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.075762 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gb9m\" (UniqueName: \"kubernetes.io/projected/a44a4731-ca3b-40f4-a474-817f51770a3e-kube-api-access-6gb9m\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.176625 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a4731-ca3b-40f4-a474-817f51770a3e-serving-cert\") pod 
\"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.176723 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-config\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.176766 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64406017-7ac3-43b6-8bd5-96c9a9d37d65-serving-cert\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.176806 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-proxy-ca-bundles\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.176851 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-config\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.176887 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4wklr\" (UniqueName: \"kubernetes.io/projected/64406017-7ac3-43b6-8bd5-96c9a9d37d65-kube-api-access-4wklr\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.176933 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gb9m\" (UniqueName: \"kubernetes.io/projected/a44a4731-ca3b-40f4-a474-817f51770a3e-kube-api-access-6gb9m\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.177025 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-client-ca\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.177092 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-client-ca\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.178135 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-proxy-ca-bundles\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " 
pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.178225 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-config\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.178574 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-client-ca\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.179207 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-config\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.179997 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-client-ca\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.186593 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a4731-ca3b-40f4-a474-817f51770a3e-serving-cert\") pod 
\"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.186685 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64406017-7ac3-43b6-8bd5-96c9a9d37d65-serving-cert\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.233189 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wklr\" (UniqueName: \"kubernetes.io/projected/64406017-7ac3-43b6-8bd5-96c9a9d37d65-kube-api-access-4wklr\") pod \"controller-manager-696b475cdf-lnfmg\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.238248 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gb9m\" (UniqueName: \"kubernetes.io/projected/a44a4731-ca3b-40f4-a474-817f51770a3e-kube-api-access-6gb9m\") pod \"route-controller-manager-556c8cbd6d-cdfzr\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.279813 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.295019 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.468434 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-696b475cdf-lnfmg"] Jan 30 21:20:56 crc kubenswrapper[4834]: I0130 21:20:56.732156 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr"] Jan 30 21:20:56 crc kubenswrapper[4834]: W0130 21:20:56.748618 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda44a4731_ca3b_40f4_a474_817f51770a3e.slice/crio-752507ac617859174a816cd422cca25fbf87fa79230ddf803631c20f45b8d451 WatchSource:0}: Error finding container 752507ac617859174a816cd422cca25fbf87fa79230ddf803631c20f45b8d451: Status 404 returned error can't find the container with id 752507ac617859174a816cd422cca25fbf87fa79230ddf803631c20f45b8d451 Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.388990 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" event={"ID":"64406017-7ac3-43b6-8bd5-96c9a9d37d65","Type":"ContainerStarted","Data":"aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3"} Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.389350 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" event={"ID":"64406017-7ac3-43b6-8bd5-96c9a9d37d65","Type":"ContainerStarted","Data":"9da4cec04af097650d07ab67d6f473a943bf133c5b6cd8bf8551570141aae419"} Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.390199 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.393421 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" event={"ID":"a44a4731-ca3b-40f4-a474-817f51770a3e","Type":"ContainerStarted","Data":"a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90"} Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.393483 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" event={"ID":"a44a4731-ca3b-40f4-a474-817f51770a3e","Type":"ContainerStarted","Data":"752507ac617859174a816cd422cca25fbf87fa79230ddf803631c20f45b8d451"} Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.393641 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.395972 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.398988 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.426186 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" podStartSLOduration=3.42617063 podStartE2EDuration="3.42617063s" podCreationTimestamp="2026-01-30 21:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:20:57.409689343 +0000 UTC m=+308.562835481" watchObservedRunningTime="2026-01-30 21:20:57.42617063 +0000 UTC m=+308.579316768" Jan 30 21:20:57 crc kubenswrapper[4834]: I0130 21:20:57.453716 4834 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" podStartSLOduration=3.453702408 podStartE2EDuration="3.453702408s" podCreationTimestamp="2026-01-30 21:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:20:57.429431671 +0000 UTC m=+308.582577809" watchObservedRunningTime="2026-01-30 21:20:57.453702408 +0000 UTC m=+308.606848546" Jan 30 21:20:59 crc kubenswrapper[4834]: I0130 21:20:59.945163 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-696b475cdf-lnfmg"] Jan 30 21:20:59 crc kubenswrapper[4834]: I0130 21:20:59.959675 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr"] Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.409622 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" podUID="a44a4731-ca3b-40f4-a474-817f51770a3e" containerName="route-controller-manager" containerID="cri-o://a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90" gracePeriod=30 Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.409716 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" podUID="64406017-7ac3-43b6-8bd5-96c9a9d37d65" containerName="controller-manager" containerID="cri-o://aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3" gracePeriod=30 Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.855648 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.868108 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.934787 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-config\") pod \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.934838 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-proxy-ca-bundles\") pod \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.934875 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wklr\" (UniqueName: \"kubernetes.io/projected/64406017-7ac3-43b6-8bd5-96c9a9d37d65-kube-api-access-4wklr\") pod \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.934910 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gb9m\" (UniqueName: \"kubernetes.io/projected/a44a4731-ca3b-40f4-a474-817f51770a3e-kube-api-access-6gb9m\") pod \"a44a4731-ca3b-40f4-a474-817f51770a3e\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.934945 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-config\") 
pod \"a44a4731-ca3b-40f4-a474-817f51770a3e\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.934976 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-client-ca\") pod \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.935004 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a4731-ca3b-40f4-a474-817f51770a3e-serving-cert\") pod \"a44a4731-ca3b-40f4-a474-817f51770a3e\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.935050 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64406017-7ac3-43b6-8bd5-96c9a9d37d65-serving-cert\") pod \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\" (UID: \"64406017-7ac3-43b6-8bd5-96c9a9d37d65\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.935080 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-client-ca\") pod \"a44a4731-ca3b-40f4-a474-817f51770a3e\" (UID: \"a44a4731-ca3b-40f4-a474-817f51770a3e\") " Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.935724 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "64406017-7ac3-43b6-8bd5-96c9a9d37d65" (UID: "64406017-7ac3-43b6-8bd5-96c9a9d37d65"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.935781 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-client-ca" (OuterVolumeSpecName: "client-ca") pod "64406017-7ac3-43b6-8bd5-96c9a9d37d65" (UID: "64406017-7ac3-43b6-8bd5-96c9a9d37d65"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.935829 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-client-ca" (OuterVolumeSpecName: "client-ca") pod "a44a4731-ca3b-40f4-a474-817f51770a3e" (UID: "a44a4731-ca3b-40f4-a474-817f51770a3e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.935819 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-config" (OuterVolumeSpecName: "config") pod "a44a4731-ca3b-40f4-a474-817f51770a3e" (UID: "a44a4731-ca3b-40f4-a474-817f51770a3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.936151 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-config" (OuterVolumeSpecName: "config") pod "64406017-7ac3-43b6-8bd5-96c9a9d37d65" (UID: "64406017-7ac3-43b6-8bd5-96c9a9d37d65"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.939977 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64406017-7ac3-43b6-8bd5-96c9a9d37d65-kube-api-access-4wklr" (OuterVolumeSpecName: "kube-api-access-4wklr") pod "64406017-7ac3-43b6-8bd5-96c9a9d37d65" (UID: "64406017-7ac3-43b6-8bd5-96c9a9d37d65"). InnerVolumeSpecName "kube-api-access-4wklr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.939997 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44a4731-ca3b-40f4-a474-817f51770a3e-kube-api-access-6gb9m" (OuterVolumeSpecName: "kube-api-access-6gb9m") pod "a44a4731-ca3b-40f4-a474-817f51770a3e" (UID: "a44a4731-ca3b-40f4-a474-817f51770a3e"). InnerVolumeSpecName "kube-api-access-6gb9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.940355 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44a4731-ca3b-40f4-a474-817f51770a3e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a44a4731-ca3b-40f4-a474-817f51770a3e" (UID: "a44a4731-ca3b-40f4-a474-817f51770a3e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:21:00 crc kubenswrapper[4834]: I0130 21:21:00.940507 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64406017-7ac3-43b6-8bd5-96c9a9d37d65-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "64406017-7ac3-43b6-8bd5-96c9a9d37d65" (UID: "64406017-7ac3-43b6-8bd5-96c9a9d37d65"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.036898 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a4731-ca3b-40f4-a474-817f51770a3e-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.036945 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64406017-7ac3-43b6-8bd5-96c9a9d37d65-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.036962 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.036979 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.036996 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.037016 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wklr\" (UniqueName: \"kubernetes.io/projected/64406017-7ac3-43b6-8bd5-96c9a9d37d65-kube-api-access-4wklr\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.037032 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gb9m\" (UniqueName: \"kubernetes.io/projected/a44a4731-ca3b-40f4-a474-817f51770a3e-kube-api-access-6gb9m\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.037048 4834 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a4731-ca3b-40f4-a474-817f51770a3e-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.037061 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64406017-7ac3-43b6-8bd5-96c9a9d37d65-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.419743 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.419741 4834 generic.go:334] "Generic (PLEG): container finished" podID="64406017-7ac3-43b6-8bd5-96c9a9d37d65" containerID="aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3" exitCode=0 Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.419778 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" event={"ID":"64406017-7ac3-43b6-8bd5-96c9a9d37d65","Type":"ContainerDied","Data":"aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3"} Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.420416 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-696b475cdf-lnfmg" event={"ID":"64406017-7ac3-43b6-8bd5-96c9a9d37d65","Type":"ContainerDied","Data":"9da4cec04af097650d07ab67d6f473a943bf133c5b6cd8bf8551570141aae419"} Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.420457 4834 scope.go:117] "RemoveContainer" containerID="aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.421960 4834 generic.go:334] "Generic (PLEG): container finished" podID="a44a4731-ca3b-40f4-a474-817f51770a3e" 
containerID="a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90" exitCode=0 Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.422001 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.421993 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" event={"ID":"a44a4731-ca3b-40f4-a474-817f51770a3e","Type":"ContainerDied","Data":"a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90"} Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.422079 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr" event={"ID":"a44a4731-ca3b-40f4-a474-817f51770a3e","Type":"ContainerDied","Data":"752507ac617859174a816cd422cca25fbf87fa79230ddf803631c20f45b8d451"} Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.441696 4834 scope.go:117] "RemoveContainer" containerID="aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3" Jan 30 21:21:01 crc kubenswrapper[4834]: E0130 21:21:01.442282 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3\": container with ID starting with aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3 not found: ID does not exist" containerID="aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.442350 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3"} err="failed to get container status \"aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3\": 
rpc error: code = NotFound desc = could not find container \"aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3\": container with ID starting with aaf0b4d5f7ba9220397f0aba89ff4b3e681a3a4a474574bcb4db37f819db28f3 not found: ID does not exist" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.442428 4834 scope.go:117] "RemoveContainer" containerID="a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.465040 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-696b475cdf-lnfmg"] Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.469156 4834 scope.go:117] "RemoveContainer" containerID="a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90" Jan 30 21:21:01 crc kubenswrapper[4834]: E0130 21:21:01.469678 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90\": container with ID starting with a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90 not found: ID does not exist" containerID="a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.469755 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90"} err="failed to get container status \"a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90\": rpc error: code = NotFound desc = could not find container \"a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90\": container with ID starting with a8ec1dc71790dc14278f5b1b564500c1a2cbde34a289ccfedbf1ab3d49751c90 not found: ID does not exist" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.475066 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-696b475cdf-lnfmg"] Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.484530 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr"] Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.486510 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c8cbd6d-cdfzr"] Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.538773 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64406017-7ac3-43b6-8bd5-96c9a9d37d65" path="/var/lib/kubelet/pods/64406017-7ac3-43b6-8bd5-96c9a9d37d65/volumes" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.539870 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44a4731-ca3b-40f4-a474-817f51770a3e" path="/var/lib/kubelet/pods/a44a4731-ca3b-40f4-a474-817f51770a3e/volumes" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.957857 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d486f5f58-5v2bq"] Jan 30 21:21:01 crc kubenswrapper[4834]: E0130 21:21:01.958173 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44a4731-ca3b-40f4-a474-817f51770a3e" containerName="route-controller-manager" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.958191 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44a4731-ca3b-40f4-a474-817f51770a3e" containerName="route-controller-manager" Jan 30 21:21:01 crc kubenswrapper[4834]: E0130 21:21:01.958226 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64406017-7ac3-43b6-8bd5-96c9a9d37d65" containerName="controller-manager" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.958235 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="64406017-7ac3-43b6-8bd5-96c9a9d37d65" containerName="controller-manager" Jan 30 21:21:01 
crc kubenswrapper[4834]: I0130 21:21:01.958345 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="64406017-7ac3-43b6-8bd5-96c9a9d37d65" containerName="controller-manager" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.958375 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44a4731-ca3b-40f4-a474-817f51770a3e" containerName="route-controller-manager" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.958855 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.962463 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n"] Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.963423 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.964234 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.965307 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.966800 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.966857 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.967336 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 30 21:21:01 
crc kubenswrapper[4834]: I0130 21:21:01.968127 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.968576 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.968613 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.968836 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.976371 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.977024 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.977097 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.980012 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n"] Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.984130 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 30 21:21:01 crc kubenswrapper[4834]: I0130 21:21:01.985124 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d486f5f58-5v2bq"] Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.051231 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-client-ca\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.052347 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b0729e4-75c4-4e4b-a440-fc685fc5958f-serving-cert\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.052691 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-config\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.052937 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-proxy-ca-bundles\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.053157 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvkg\" (UniqueName: \"kubernetes.io/projected/94a66935-e061-419b-991e-a038768b893a-kube-api-access-kqvkg\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: 
\"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.053481 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a66935-e061-419b-991e-a038768b893a-serving-cert\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.053696 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-config\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.053911 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-client-ca\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.054115 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpb89\" (UniqueName: \"kubernetes.io/projected/3b0729e4-75c4-4e4b-a440-fc685fc5958f-kube-api-access-xpb89\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.155880 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-proxy-ca-bundles\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.155960 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqvkg\" (UniqueName: \"kubernetes.io/projected/94a66935-e061-419b-991e-a038768b893a-kube-api-access-kqvkg\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.156034 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a66935-e061-419b-991e-a038768b893a-serving-cert\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.156080 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-config\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.156133 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-client-ca\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " 
pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.156167 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpb89\" (UniqueName: \"kubernetes.io/projected/3b0729e4-75c4-4e4b-a440-fc685fc5958f-kube-api-access-xpb89\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.156206 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-client-ca\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.156242 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b0729e4-75c4-4e4b-a440-fc685fc5958f-serving-cert\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.156273 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-config\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.157563 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-client-ca\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.157729 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-config\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.158806 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-proxy-ca-bundles\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.158829 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-client-ca\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.161077 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a66935-e061-419b-991e-a038768b893a-config\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.167324 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a66935-e061-419b-991e-a038768b893a-serving-cert\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.176755 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b0729e4-75c4-4e4b-a440-fc685fc5958f-serving-cert\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.183199 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpb89\" (UniqueName: \"kubernetes.io/projected/3b0729e4-75c4-4e4b-a440-fc685fc5958f-kube-api-access-xpb89\") pod \"route-controller-manager-55bb989677-qss4n\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.183685 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqvkg\" (UniqueName: \"kubernetes.io/projected/94a66935-e061-419b-991e-a038768b893a-kube-api-access-kqvkg\") pod \"controller-manager-6d486f5f58-5v2bq\" (UID: \"94a66935-e061-419b-991e-a038768b893a\") " pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.278693 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.293387 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.491446 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d486f5f58-5v2bq"] Jan 30 21:21:02 crc kubenswrapper[4834]: W0130 21:21:02.493155 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a66935_e061_419b_991e_a038768b893a.slice/crio-745f4c13fb1bedbddb6fab111c512746781455a33bbe971cc75f1d78ba7f20e6 WatchSource:0}: Error finding container 745f4c13fb1bedbddb6fab111c512746781455a33bbe971cc75f1d78ba7f20e6: Status 404 returned error can't find the container with id 745f4c13fb1bedbddb6fab111c512746781455a33bbe971cc75f1d78ba7f20e6 Jan 30 21:21:02 crc kubenswrapper[4834]: I0130 21:21:02.556154 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n"] Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.436518 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" event={"ID":"3b0729e4-75c4-4e4b-a440-fc685fc5958f","Type":"ContainerStarted","Data":"865626ac050ce7df0acccd1f7912e52a7e2a48dc3f5cf7b1b3239d78c8e00636"} Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.436593 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" event={"ID":"3b0729e4-75c4-4e4b-a440-fc685fc5958f","Type":"ContainerStarted","Data":"fffa563a4275d1437a5b14e7d1a068a877e6a0edcf761894c3cb0c667d036a15"} Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.436861 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:03 crc 
kubenswrapper[4834]: I0130 21:21:03.438409 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" event={"ID":"94a66935-e061-419b-991e-a038768b893a","Type":"ContainerStarted","Data":"fa512ae7de8e77aa32c365802f0a9ba2dc3f00842c32af12db8edc739bf5d382"} Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.438462 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" event={"ID":"94a66935-e061-419b-991e-a038768b893a","Type":"ContainerStarted","Data":"745f4c13fb1bedbddb6fab111c512746781455a33bbe971cc75f1d78ba7f20e6"} Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.438561 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.441977 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.443183 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.457118 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" podStartSLOduration=3.457104696 podStartE2EDuration="3.457104696s" podCreationTimestamp="2026-01-30 21:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:21:03.456159247 +0000 UTC m=+314.609305385" watchObservedRunningTime="2026-01-30 21:21:03.457104696 +0000 UTC m=+314.610250834" Jan 30 21:21:03 crc kubenswrapper[4834]: I0130 21:21:03.516996 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d486f5f58-5v2bq" podStartSLOduration=4.5169777490000005 podStartE2EDuration="4.516977749s" podCreationTimestamp="2026-01-30 21:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:21:03.475228974 +0000 UTC m=+314.628375132" watchObservedRunningTime="2026-01-30 21:21:03.516977749 +0000 UTC m=+314.670123887" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.619456 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4jx29"] Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.621123 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.640331 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4jx29"] Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.758860 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-registry-tls\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.758912 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ab70e832-6459-4880-a982-98f0cc79c601-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: 
I0130 21:21:09.758943 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ab70e832-6459-4880-a982-98f0cc79c601-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.758968 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.758986 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g92cr\" (UniqueName: \"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-kube-api-access-g92cr\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.759005 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ab70e832-6459-4880-a982-98f0cc79c601-registry-certificates\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.759023 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-bound-sa-token\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.759055 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab70e832-6459-4880-a982-98f0cc79c601-trusted-ca\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.799843 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.860105 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab70e832-6459-4880-a982-98f0cc79c601-trusted-ca\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.860479 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-registry-tls\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.860613 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ab70e832-6459-4880-a982-98f0cc79c601-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.860771 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ab70e832-6459-4880-a982-98f0cc79c601-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.860885 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g92cr\" (UniqueName: \"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-kube-api-access-g92cr\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.861001 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ab70e832-6459-4880-a982-98f0cc79c601-registry-certificates\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.861112 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-bound-sa-token\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.861366 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ab70e832-6459-4880-a982-98f0cc79c601-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.862453 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab70e832-6459-4880-a982-98f0cc79c601-trusted-ca\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.863010 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ab70e832-6459-4880-a982-98f0cc79c601-registry-certificates\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.866542 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ab70e832-6459-4880-a982-98f0cc79c601-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.867022 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-registry-tls\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.882173 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-bound-sa-token\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.886787 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g92cr\" (UniqueName: \"kubernetes.io/projected/ab70e832-6459-4880-a982-98f0cc79c601-kube-api-access-g92cr\") pod \"image-registry-66df7c8f76-4jx29\" (UID: \"ab70e832-6459-4880-a982-98f0cc79c601\") " pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:09 crc kubenswrapper[4834]: I0130 21:21:09.946230 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:10 crc kubenswrapper[4834]: I0130 21:21:10.386094 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4jx29"]
Jan 30 21:21:10 crc kubenswrapper[4834]: I0130 21:21:10.482269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" event={"ID":"ab70e832-6459-4880-a982-98f0cc79c601","Type":"ContainerStarted","Data":"d5a87795bb05f0874f981140ec2a4bf814266ae799b29e0e1570c252e7ad42cb"}
Jan 30 21:21:11 crc kubenswrapper[4834]: I0130 21:21:11.489717 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" event={"ID":"ab70e832-6459-4880-a982-98f0cc79c601","Type":"ContainerStarted","Data":"8c789e6d9315fe10ddbcbc4bc2bd94097c713958cfe9478e11360975c600177a"}
Jan 30 21:21:11 crc kubenswrapper[4834]: I0130 21:21:11.490011 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:11 crc kubenswrapper[4834]: I0130 21:21:11.541132 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4jx29" podStartSLOduration=2.541111885 podStartE2EDuration="2.541111885s" podCreationTimestamp="2026-01-30 21:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:21:11.534482801 +0000 UTC m=+322.687628979" watchObservedRunningTime="2026-01-30 21:21:11.541111885 +0000 UTC m=+322.694258033"
Jan 30 21:21:29 crc kubenswrapper[4834]: I0130 21:21:29.953794 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-4jx29"
Jan 30 21:21:30 crc kubenswrapper[4834]: I0130 21:21:30.034634 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pfvpm"]
Jan 30 21:21:34 crc kubenswrapper[4834]: I0130 21:21:34.161079 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:21:34 crc kubenswrapper[4834]: I0130 21:21:34.161539 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.501209 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-82j9j"]
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.503668 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-82j9j" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="registry-server" containerID="cri-o://3ef5d92d4409fbb36784676c33dceff9c72e99e8c39c0d47370be281e229b9dc" gracePeriod=30
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.526454 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-blc9b"]
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.526883 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-blc9b" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="registry-server" containerID="cri-o://6d673b42d73d416821a96ffceadbda400138b8d30efdc72ef16245d229d08a67" gracePeriod=30
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.542739 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9kdrb"]
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.543096 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" containerID="cri-o://5912761b1d49fe327b7005cc40b8b47231cb943968aca0878ea672b212cdfaed" gracePeriod=30
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.552849 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q6bh"]
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.553169 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9q6bh" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="registry-server" containerID="cri-o://166a044c17c9fc3d0768bb76d46bb024212d7535f8f39ccc71869a751772619c" gracePeriod=30
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.565474 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xpwnw"]
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.565760 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xpwnw" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="registry-server" containerID="cri-o://2eab9a1aeb1b44fcd4c435f34581bd7f8f7af74ef1d26c5664001c9ce0032f2e" gracePeriod=30
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.574656 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lq5l9"]
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.575682 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.581151 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lq5l9"]
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.603925 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-9q6bh" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="registry-server" probeResult="failure" output=""
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.608949 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-9q6bh" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="registry-server" probeResult="failure" output=""
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.667942 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.668298 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46ws\" (UniqueName: \"kubernetes.io/projected/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-kube-api-access-f46ws\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.668385 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.753327 4834 generic.go:334] "Generic (PLEG): container finished" podID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerID="5912761b1d49fe327b7005cc40b8b47231cb943968aca0878ea672b212cdfaed" exitCode=0
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.753418 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" event={"ID":"aab9a42d-c833-46b2-a745-1bb95ada7f68","Type":"ContainerDied","Data":"5912761b1d49fe327b7005cc40b8b47231cb943968aca0878ea672b212cdfaed"}
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.753566 4834 scope.go:117] "RemoveContainer" containerID="f1e3f597d5bb3d4210ce321e536af817f9bb98d856b761ab9d758e1a17d7fc75"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.758625 4834 generic.go:334] "Generic (PLEG): container finished" podID="be0cb498-ae6b-47f1-8068-9f7653206006" containerID="6d673b42d73d416821a96ffceadbda400138b8d30efdc72ef16245d229d08a67" exitCode=0
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.758706 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blc9b" event={"ID":"be0cb498-ae6b-47f1-8068-9f7653206006","Type":"ContainerDied","Data":"6d673b42d73d416821a96ffceadbda400138b8d30efdc72ef16245d229d08a67"}
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.761328 4834 generic.go:334] "Generic (PLEG): container finished" podID="d19012b5-546c-4af0-a419-f57194f5eff8" containerID="3ef5d92d4409fbb36784676c33dceff9c72e99e8c39c0d47370be281e229b9dc" exitCode=0
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.761375 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82j9j" event={"ID":"d19012b5-546c-4af0-a419-f57194f5eff8","Type":"ContainerDied","Data":"3ef5d92d4409fbb36784676c33dceff9c72e99e8c39c0d47370be281e229b9dc"}
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.764771 4834 generic.go:334] "Generic (PLEG): container finished" podID="433db183-f17f-4f55-b6f4-901614906a48" containerID="166a044c17c9fc3d0768bb76d46bb024212d7535f8f39ccc71869a751772619c" exitCode=0
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.764818 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q6bh" event={"ID":"433db183-f17f-4f55-b6f4-901614906a48","Type":"ContainerDied","Data":"166a044c17c9fc3d0768bb76d46bb024212d7535f8f39ccc71869a751772619c"}
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.766467 4834 generic.go:334] "Generic (PLEG): container finished" podID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerID="2eab9a1aeb1b44fcd4c435f34581bd7f8f7af74ef1d26c5664001c9ce0032f2e" exitCode=0
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.766502 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpwnw" event={"ID":"864b4107-52a4-4db4-a6f7-ca80d4122d26","Type":"ContainerDied","Data":"2eab9a1aeb1b44fcd4c435f34581bd7f8f7af74ef1d26c5664001c9ce0032f2e"}
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.769353 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f46ws\" (UniqueName: \"kubernetes.io/projected/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-kube-api-access-f46ws\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.769488 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.769531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.771427 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.778670 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.787965 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46ws\" (UniqueName: \"kubernetes.io/projected/92a7c64f-4f7c-473c-94cd-3ec4e3ae546e-kube-api-access-f46ws\") pod \"marketplace-operator-79b997595-lq5l9\" (UID: \"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e\") " pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:50 crc kubenswrapper[4834]: I0130 21:21:50.891003 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.020026 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-82j9j"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.086115 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.121862 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9q6bh"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.124092 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-blc9b"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.147693 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpwnw"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.175498 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-utilities\") pod \"d19012b5-546c-4af0-a419-f57194f5eff8\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.175584 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-operator-metrics\") pod \"aab9a42d-c833-46b2-a745-1bb95ada7f68\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.175625 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/d19012b5-546c-4af0-a419-f57194f5eff8-kube-api-access-d9mqv\") pod \"d19012b5-546c-4af0-a419-f57194f5eff8\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.175671 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-trusted-ca\") pod \"aab9a42d-c833-46b2-a745-1bb95ada7f68\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.175694 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-catalog-content\") pod \"d19012b5-546c-4af0-a419-f57194f5eff8\" (UID: \"d19012b5-546c-4af0-a419-f57194f5eff8\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.175728 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djl5x\" (UniqueName: \"kubernetes.io/projected/aab9a42d-c833-46b2-a745-1bb95ada7f68-kube-api-access-djl5x\") pod \"aab9a42d-c833-46b2-a745-1bb95ada7f68\" (UID: \"aab9a42d-c833-46b2-a745-1bb95ada7f68\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.177083 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-utilities" (OuterVolumeSpecName: "utilities") pod "d19012b5-546c-4af0-a419-f57194f5eff8" (UID: "d19012b5-546c-4af0-a419-f57194f5eff8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.177740 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "aab9a42d-c833-46b2-a745-1bb95ada7f68" (UID: "aab9a42d-c833-46b2-a745-1bb95ada7f68"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.183530 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19012b5-546c-4af0-a419-f57194f5eff8-kube-api-access-d9mqv" (OuterVolumeSpecName: "kube-api-access-d9mqv") pod "d19012b5-546c-4af0-a419-f57194f5eff8" (UID: "d19012b5-546c-4af0-a419-f57194f5eff8"). InnerVolumeSpecName "kube-api-access-d9mqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.183766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "aab9a42d-c833-46b2-a745-1bb95ada7f68" (UID: "aab9a42d-c833-46b2-a745-1bb95ada7f68"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.191128 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab9a42d-c833-46b2-a745-1bb95ada7f68-kube-api-access-djl5x" (OuterVolumeSpecName: "kube-api-access-djl5x") pod "aab9a42d-c833-46b2-a745-1bb95ada7f68" (UID: "aab9a42d-c833-46b2-a745-1bb95ada7f68"). InnerVolumeSpecName "kube-api-access-djl5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.244191 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d19012b5-546c-4af0-a419-f57194f5eff8" (UID: "d19012b5-546c-4af0-a419-f57194f5eff8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.272073 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lq5l9"]
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.276534 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-utilities\") pod \"433db183-f17f-4f55-b6f4-901614906a48\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.276623 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-catalog-content\") pod \"433db183-f17f-4f55-b6f4-901614906a48\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.276650 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-utilities\") pod \"be0cb498-ae6b-47f1-8068-9f7653206006\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.276681 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-catalog-content\") pod \"864b4107-52a4-4db4-a6f7-ca80d4122d26\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.276703 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbj9z\" (UniqueName: \"kubernetes.io/projected/be0cb498-ae6b-47f1-8068-9f7653206006-kube-api-access-sbj9z\") pod \"be0cb498-ae6b-47f1-8068-9f7653206006\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.276732 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-catalog-content\") pod \"be0cb498-ae6b-47f1-8068-9f7653206006\" (UID: \"be0cb498-ae6b-47f1-8068-9f7653206006\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277212 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-utilities\") pod \"864b4107-52a4-4db4-a6f7-ca80d4122d26\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277236 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq88h\" (UniqueName: \"kubernetes.io/projected/433db183-f17f-4f55-b6f4-901614906a48-kube-api-access-xq88h\") pod \"433db183-f17f-4f55-b6f4-901614906a48\" (UID: \"433db183-f17f-4f55-b6f4-901614906a48\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277265 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjdkr\" (UniqueName: \"kubernetes.io/projected/864b4107-52a4-4db4-a6f7-ca80d4122d26-kube-api-access-qjdkr\") pod \"864b4107-52a4-4db4-a6f7-ca80d4122d26\" (UID: \"864b4107-52a4-4db4-a6f7-ca80d4122d26\") "
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277334 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-utilities" (OuterVolumeSpecName: "utilities") pod "be0cb498-ae6b-47f1-8068-9f7653206006" (UID: "be0cb498-ae6b-47f1-8068-9f7653206006"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277483 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277495 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277505 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277542 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9mqv\" (UniqueName: \"kubernetes.io/projected/d19012b5-546c-4af0-a419-f57194f5eff8-kube-api-access-d9mqv\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277550 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab9a42d-c833-46b2-a745-1bb95ada7f68-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277558 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19012b5-546c-4af0-a419-f57194f5eff8-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277566 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djl5x\" (UniqueName: \"kubernetes.io/projected/aab9a42d-c833-46b2-a745-1bb95ada7f68-kube-api-access-djl5x\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.277848 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-utilities" (OuterVolumeSpecName: "utilities") pod "864b4107-52a4-4db4-a6f7-ca80d4122d26" (UID: "864b4107-52a4-4db4-a6f7-ca80d4122d26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.278447 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-utilities" (OuterVolumeSpecName: "utilities") pod "433db183-f17f-4f55-b6f4-901614906a48" (UID: "433db183-f17f-4f55-b6f4-901614906a48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.280292 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be0cb498-ae6b-47f1-8068-9f7653206006-kube-api-access-sbj9z" (OuterVolumeSpecName: "kube-api-access-sbj9z") pod "be0cb498-ae6b-47f1-8068-9f7653206006" (UID: "be0cb498-ae6b-47f1-8068-9f7653206006"). InnerVolumeSpecName "kube-api-access-sbj9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.280579 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864b4107-52a4-4db4-a6f7-ca80d4122d26-kube-api-access-qjdkr" (OuterVolumeSpecName: "kube-api-access-qjdkr") pod "864b4107-52a4-4db4-a6f7-ca80d4122d26" (UID: "864b4107-52a4-4db4-a6f7-ca80d4122d26"). InnerVolumeSpecName "kube-api-access-qjdkr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.280904 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/433db183-f17f-4f55-b6f4-901614906a48-kube-api-access-xq88h" (OuterVolumeSpecName: "kube-api-access-xq88h") pod "433db183-f17f-4f55-b6f4-901614906a48" (UID: "433db183-f17f-4f55-b6f4-901614906a48"). InnerVolumeSpecName "kube-api-access-xq88h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.324024 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "433db183-f17f-4f55-b6f4-901614906a48" (UID: "433db183-f17f-4f55-b6f4-901614906a48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.337898 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be0cb498-ae6b-47f1-8068-9f7653206006" (UID: "be0cb498-ae6b-47f1-8068-9f7653206006"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.379084 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.379127 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq88h\" (UniqueName: \"kubernetes.io/projected/433db183-f17f-4f55-b6f4-901614906a48-kube-api-access-xq88h\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.379139 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjdkr\" (UniqueName: \"kubernetes.io/projected/864b4107-52a4-4db4-a6f7-ca80d4122d26-kube-api-access-qjdkr\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.379147 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.379156 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/433db183-f17f-4f55-b6f4-901614906a48-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.379165 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbj9z\" (UniqueName: \"kubernetes.io/projected/be0cb498-ae6b-47f1-8068-9f7653206006-kube-api-access-sbj9z\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.379190 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be0cb498-ae6b-47f1-8068-9f7653206006-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.404906 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "864b4107-52a4-4db4-a6f7-ca80d4122d26" (UID: "864b4107-52a4-4db4-a6f7-ca80d4122d26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.480444 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/864b4107-52a4-4db4-a6f7-ca80d4122d26-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.772771 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9" event={"ID":"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e","Type":"ContainerStarted","Data":"7d6e93eb39af99c63bb688044ec9f6ab746fd0297cfee2ccc0b49893a5b8492f"}
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.772831 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9" event={"ID":"92a7c64f-4f7c-473c-94cd-3ec4e3ae546e","Type":"ContainerStarted","Data":"014e65ad03c5d0be02bb6d74cf77ca7e1853b5e569b237635dcad933c4e8fae3"}
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.773043 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.775684 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-blc9b"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.776415 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-blc9b" event={"ID":"be0cb498-ae6b-47f1-8068-9f7653206006","Type":"ContainerDied","Data":"aaecc91b58d80c614022df9a75dc08eae3e7e7e69d7b9aaec5ee621c8ce04644"}
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.776465 4834 scope.go:117] "RemoveContainer" containerID="6d673b42d73d416821a96ffceadbda400138b8d30efdc72ef16245d229d08a67"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.776789 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.780022 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-82j9j" event={"ID":"d19012b5-546c-4af0-a419-f57194f5eff8","Type":"ContainerDied","Data":"d68c8c55ed074bef04bd26ea319dd361389622fe8a4db7d5fd8768e0132e6ad0"}
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.780105 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-82j9j"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.786919 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9q6bh" event={"ID":"433db183-f17f-4f55-b6f4-901614906a48","Type":"ContainerDied","Data":"cb10fd780459218ab3d2e6c78d2be58c0ab75af312e72887b69f184bfd45bf3f"}
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.787079 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9q6bh"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.789560 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lq5l9" podStartSLOduration=1.7895470119999999 podStartE2EDuration="1.789547012s" podCreationTimestamp="2026-01-30 21:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:21:51.788623043 +0000 UTC m=+362.941769191" watchObservedRunningTime="2026-01-30 21:21:51.789547012 +0000 UTC m=+362.942693160"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.790691 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xpwnw" event={"ID":"864b4107-52a4-4db4-a6f7-ca80d4122d26","Type":"ContainerDied","Data":"305828a82b1d084818d6d9ec10dc44b438a6ef75516fd8ef37186d7affeb1a44"}
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.790758 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xpwnw"
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.793454 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" event={"ID":"aab9a42d-c833-46b2-a745-1bb95ada7f68","Type":"ContainerDied","Data":"11160ce93d171339a3eaddddc1aecfd0c05723be0c4eb057f68a59b99220136d"}
Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.793599 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9kdrb" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.796637 4834 scope.go:117] "RemoveContainer" containerID="07a2a2b436de6881879ee6955630e0cacd032a8e0126365cce92f5e5b19498d4" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.808738 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-82j9j"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.816461 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-82j9j"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.819562 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-blc9b"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.820090 4834 scope.go:117] "RemoveContainer" containerID="097ebf03473446b13c8c0c16a95615348ccf36ae363dc047dbbaf66f8520bb9d" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.823080 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-blc9b"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.853099 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9kdrb"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.853430 4834 scope.go:117] "RemoveContainer" containerID="3ef5d92d4409fbb36784676c33dceff9c72e99e8c39c0d47370be281e229b9dc" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.857020 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9kdrb"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.865155 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xpwnw"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.870544 4834 scope.go:117] "RemoveContainer" 
containerID="c661d5053142de574fe904aa8ad3a217084af08e2a4e69992c48113343554d21" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.871879 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xpwnw"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.884829 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q6bh"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.887348 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9q6bh"] Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.894212 4834 scope.go:117] "RemoveContainer" containerID="13862b67399434f77a503e0853c1f982148cace42f650e586eb46ea02172c7ae" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.907725 4834 scope.go:117] "RemoveContainer" containerID="166a044c17c9fc3d0768bb76d46bb024212d7535f8f39ccc71869a751772619c" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.925646 4834 scope.go:117] "RemoveContainer" containerID="ad15784b169ba0765ddd0e6a8c3c31eab3bb862092376c18e86d793223da1d48" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.937289 4834 scope.go:117] "RemoveContainer" containerID="7c1015e35b3686baab7cbc7116ac76be04162889325e2e44542f16bdb003a52e" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.947124 4834 scope.go:117] "RemoveContainer" containerID="2eab9a1aeb1b44fcd4c435f34581bd7f8f7af74ef1d26c5664001c9ce0032f2e" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.957119 4834 scope.go:117] "RemoveContainer" containerID="d15eeb0972dde61a8fc29b96c45d5b8ec3cb738090417ded31ba59a33d8b43ba" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.968261 4834 scope.go:117] "RemoveContainer" containerID="3510a33f834cae793af7308cc5588e6d27f3af17e6897c8dd94013cb0dba0cd4" Jan 30 21:21:51 crc kubenswrapper[4834]: I0130 21:21:51.979005 4834 scope.go:117] "RemoveContainer" 
containerID="5912761b1d49fe327b7005cc40b8b47231cb943968aca0878ea672b212cdfaed" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729142 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2k5hc"] Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729489 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="extract-utilities" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729512 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="extract-utilities" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729526 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="extract-content" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729538 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="extract-content" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729554 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729565 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729578 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="extract-content" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729587 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="extract-content" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729599 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="extract-utilities" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729609 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="extract-utilities" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729622 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="extract-utilities" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729632 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="extract-utilities" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729647 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="extract-utilities" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729658 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="extract-utilities" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729674 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="extract-content" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729684 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="extract-content" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729701 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729711 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729725 4834 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729734 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729751 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="extract-content" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729762 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="extract-content" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729776 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729801 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729817 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729828 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: E0130 21:21:52.729842 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729861 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.729998 4834 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="433db183-f17f-4f55-b6f4-901614906a48" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.730014 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.730031 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.730048 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.730060 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" containerName="marketplace-operator" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.730074 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" containerName="registry-server" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.731083 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.735187 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.748858 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2k5hc"] Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.900132 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-utilities\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.900512 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f6c2\" (UniqueName: \"kubernetes.io/projected/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-kube-api-access-7f6c2\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.900566 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-catalog-content\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.939161 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r7dp9"] Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.940805 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.943294 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r7dp9"] Jan 30 21:21:52 crc kubenswrapper[4834]: I0130 21:21:52.944649 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.001898 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-utilities\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.002309 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f6c2\" (UniqueName: \"kubernetes.io/projected/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-kube-api-access-7f6c2\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.002350 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-catalog-content\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.002659 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-utilities\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 
30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.003100 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-catalog-content\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.031649 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f6c2\" (UniqueName: \"kubernetes.io/projected/ae12279c-fd14-40dd-a3bd-c3d5ceaac331-kube-api-access-7f6c2\") pod \"redhat-marketplace-2k5hc\" (UID: \"ae12279c-fd14-40dd-a3bd-c3d5ceaac331\") " pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.064566 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.103387 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c12556-80ec-42b1-9d47-ead9224f86ff-catalog-content\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.103656 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn629\" (UniqueName: \"kubernetes.io/projected/e8c12556-80ec-42b1-9d47-ead9224f86ff-kube-api-access-qn629\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.103810 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e8c12556-80ec-42b1-9d47-ead9224f86ff-utilities\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.205288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c12556-80ec-42b1-9d47-ead9224f86ff-catalog-content\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.205326 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn629\" (UniqueName: \"kubernetes.io/projected/e8c12556-80ec-42b1-9d47-ead9224f86ff-kube-api-access-qn629\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.205362 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c12556-80ec-42b1-9d47-ead9224f86ff-utilities\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.206185 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c12556-80ec-42b1-9d47-ead9224f86ff-utilities\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.206236 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e8c12556-80ec-42b1-9d47-ead9224f86ff-catalog-content\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.225837 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn629\" (UniqueName: \"kubernetes.io/projected/e8c12556-80ec-42b1-9d47-ead9224f86ff-kube-api-access-qn629\") pod \"redhat-operators-r7dp9\" (UID: \"e8c12556-80ec-42b1-9d47-ead9224f86ff\") " pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.258081 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.506271 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2k5hc"] Jan 30 21:21:53 crc kubenswrapper[4834]: W0130 21:21:53.512247 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae12279c_fd14_40dd_a3bd_c3d5ceaac331.slice/crio-8296043d7fd91fc997766f961b0c700a9973c198bedf74d9850b9e71ed8c5beb WatchSource:0}: Error finding container 8296043d7fd91fc997766f961b0c700a9973c198bedf74d9850b9e71ed8c5beb: Status 404 returned error can't find the container with id 8296043d7fd91fc997766f961b0c700a9973c198bedf74d9850b9e71ed8c5beb Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.544304 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="433db183-f17f-4f55-b6f4-901614906a48" path="/var/lib/kubelet/pods/433db183-f17f-4f55-b6f4-901614906a48/volumes" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.545242 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864b4107-52a4-4db4-a6f7-ca80d4122d26" 
path="/var/lib/kubelet/pods/864b4107-52a4-4db4-a6f7-ca80d4122d26/volumes" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.546122 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab9a42d-c833-46b2-a745-1bb95ada7f68" path="/var/lib/kubelet/pods/aab9a42d-c833-46b2-a745-1bb95ada7f68/volumes" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.547290 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be0cb498-ae6b-47f1-8068-9f7653206006" path="/var/lib/kubelet/pods/be0cb498-ae6b-47f1-8068-9f7653206006/volumes" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.548115 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19012b5-546c-4af0-a419-f57194f5eff8" path="/var/lib/kubelet/pods/d19012b5-546c-4af0-a419-f57194f5eff8/volumes" Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.662866 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r7dp9"] Jan 30 21:21:53 crc kubenswrapper[4834]: W0130 21:21:53.667034 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c12556_80ec_42b1_9d47_ead9224f86ff.slice/crio-46302876dfd673b5e0c19714a17be7c2bf6c2b89a6a0d36b03bca70e5e34c712 WatchSource:0}: Error finding container 46302876dfd673b5e0c19714a17be7c2bf6c2b89a6a0d36b03bca70e5e34c712: Status 404 returned error can't find the container with id 46302876dfd673b5e0c19714a17be7c2bf6c2b89a6a0d36b03bca70e5e34c712 Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.817032 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7dp9" event={"ID":"e8c12556-80ec-42b1-9d47-ead9224f86ff","Type":"ContainerStarted","Data":"46302876dfd673b5e0c19714a17be7c2bf6c2b89a6a0d36b03bca70e5e34c712"} Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.818496 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="ae12279c-fd14-40dd-a3bd-c3d5ceaac331" containerID="a9dcfebaf7c01a493743e49a39cdf6321987dfb35a9316607d78cd799206b2f8" exitCode=0 Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.818584 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k5hc" event={"ID":"ae12279c-fd14-40dd-a3bd-c3d5ceaac331","Type":"ContainerDied","Data":"a9dcfebaf7c01a493743e49a39cdf6321987dfb35a9316607d78cd799206b2f8"} Jan 30 21:21:53 crc kubenswrapper[4834]: I0130 21:21:53.818610 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k5hc" event={"ID":"ae12279c-fd14-40dd-a3bd-c3d5ceaac331","Type":"ContainerStarted","Data":"8296043d7fd91fc997766f961b0c700a9973c198bedf74d9850b9e71ed8c5beb"} Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.441902 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n"] Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.443123 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" podUID="3b0729e4-75c4-4e4b-a440-fc685fc5958f" containerName="route-controller-manager" containerID="cri-o://865626ac050ce7df0acccd1f7912e52a7e2a48dc3f5cf7b1b3239d78c8e00636" gracePeriod=30 Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.825254 4834 generic.go:334] "Generic (PLEG): container finished" podID="e8c12556-80ec-42b1-9d47-ead9224f86ff" containerID="cdd2bff120e6a3ff676c2e32f365da6c1e0b1d5a653ddd51eb79fcfb28064451" exitCode=0 Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.825418 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7dp9" event={"ID":"e8c12556-80ec-42b1-9d47-ead9224f86ff","Type":"ContainerDied","Data":"cdd2bff120e6a3ff676c2e32f365da6c1e0b1d5a653ddd51eb79fcfb28064451"} Jan 30 21:21:54 crc 
kubenswrapper[4834]: I0130 21:21:54.827025 4834 generic.go:334] "Generic (PLEG): container finished" podID="3b0729e4-75c4-4e4b-a440-fc685fc5958f" containerID="865626ac050ce7df0acccd1f7912e52a7e2a48dc3f5cf7b1b3239d78c8e00636" exitCode=0 Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.827079 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" event={"ID":"3b0729e4-75c4-4e4b-a440-fc685fc5958f","Type":"ContainerDied","Data":"865626ac050ce7df0acccd1f7912e52a7e2a48dc3f5cf7b1b3239d78c8e00636"} Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.827099 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" event={"ID":"3b0729e4-75c4-4e4b-a440-fc685fc5958f","Type":"ContainerDied","Data":"fffa563a4275d1437a5b14e7d1a068a877e6a0edcf761894c3cb0c667d036a15"} Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.827109 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fffa563a4275d1437a5b14e7d1a068a877e6a0edcf761894c3cb0c667d036a15" Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.848484 4834 generic.go:334] "Generic (PLEG): container finished" podID="ae12279c-fd14-40dd-a3bd-c3d5ceaac331" containerID="0562d3a0992384d369f55fe9ffcccc09271d39a30d4a407d458a685c324a1f74" exitCode=0 Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.848534 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k5hc" event={"ID":"ae12279c-fd14-40dd-a3bd-c3d5ceaac331","Type":"ContainerDied","Data":"0562d3a0992384d369f55fe9ffcccc09271d39a30d4a407d458a685c324a1f74"} Jan 30 21:21:54 crc kubenswrapper[4834]: I0130 21:21:54.863357 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.044370 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-config\") pod \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.044677 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b0729e4-75c4-4e4b-a440-fc685fc5958f-serving-cert\") pod \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.044711 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpb89\" (UniqueName: \"kubernetes.io/projected/3b0729e4-75c4-4e4b-a440-fc685fc5958f-kube-api-access-xpb89\") pod \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.045522 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-client-ca" (OuterVolumeSpecName: "client-ca") pod "3b0729e4-75c4-4e4b-a440-fc685fc5958f" (UID: "3b0729e4-75c4-4e4b-a440-fc685fc5958f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.044783 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-client-ca\") pod \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\" (UID: \"3b0729e4-75c4-4e4b-a440-fc685fc5958f\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.045629 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-config" (OuterVolumeSpecName: "config") pod "3b0729e4-75c4-4e4b-a440-fc685fc5958f" (UID: "3b0729e4-75c4-4e4b-a440-fc685fc5958f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.045859 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.045876 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b0729e4-75c4-4e4b-a440-fc685fc5958f-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.051295 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b0729e4-75c4-4e4b-a440-fc685fc5958f-kube-api-access-xpb89" (OuterVolumeSpecName: "kube-api-access-xpb89") pod "3b0729e4-75c4-4e4b-a440-fc685fc5958f" (UID: "3b0729e4-75c4-4e4b-a440-fc685fc5958f"). InnerVolumeSpecName "kube-api-access-xpb89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.051316 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b0729e4-75c4-4e4b-a440-fc685fc5958f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3b0729e4-75c4-4e4b-a440-fc685fc5958f" (UID: "3b0729e4-75c4-4e4b-a440-fc685fc5958f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.098731 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" podUID="58995455-5f53-49bb-84e7-dab094ffec5b" containerName="registry" containerID="cri-o://8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b" gracePeriod=30 Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.128261 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xmwt6"] Jan 30 21:21:55 crc kubenswrapper[4834]: E0130 21:21:55.128481 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b0729e4-75c4-4e4b-a440-fc685fc5958f" containerName="route-controller-manager" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.128492 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b0729e4-75c4-4e4b-a440-fc685fc5958f" containerName="route-controller-manager" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.128577 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b0729e4-75c4-4e4b-a440-fc685fc5958f" containerName="route-controller-manager" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.129201 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.131898 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.139211 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmwt6"] Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.148099 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b0729e4-75c4-4e4b-a440-fc685fc5958f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.148131 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpb89\" (UniqueName: \"kubernetes.io/projected/3b0729e4-75c4-4e4b-a440-fc685fc5958f-kube-api-access-xpb89\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.249129 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfxd\" (UniqueName: \"kubernetes.io/projected/1421282d-913d-46ae-b270-fdde46df72a3-kube-api-access-ggfxd\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.249266 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-utilities\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.249355 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-catalog-content\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.324358 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-67tqx"] Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.325211 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.330807 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.334515 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-67tqx"] Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.360000 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-utilities\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.360072 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-catalog-content\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.360142 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpmmp\" (UniqueName: 
\"kubernetes.io/projected/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-kube-api-access-gpmmp\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.360172 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-catalog-content\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.360197 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfxd\" (UniqueName: \"kubernetes.io/projected/1421282d-913d-46ae-b270-fdde46df72a3-kube-api-access-ggfxd\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.360223 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-utilities\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.360506 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-utilities\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.360593 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-catalog-content\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.377971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfxd\" (UniqueName: \"kubernetes.io/projected/1421282d-913d-46ae-b270-fdde46df72a3-kube-api-access-ggfxd\") pod \"certified-operators-xmwt6\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.449882 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461037 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-trusted-ca\") pod \"58995455-5f53-49bb-84e7-dab094ffec5b\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461088 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58995455-5f53-49bb-84e7-dab094ffec5b-ca-trust-extracted\") pod \"58995455-5f53-49bb-84e7-dab094ffec5b\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461212 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"58995455-5f53-49bb-84e7-dab094ffec5b\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461243 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58995455-5f53-49bb-84e7-dab094ffec5b-installation-pull-secrets\") pod \"58995455-5f53-49bb-84e7-dab094ffec5b\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461266 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-bound-sa-token\") pod \"58995455-5f53-49bb-84e7-dab094ffec5b\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461293 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-registry-tls\") pod \"58995455-5f53-49bb-84e7-dab094ffec5b\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461317 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cfx5\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-kube-api-access-2cfx5\") pod \"58995455-5f53-49bb-84e7-dab094ffec5b\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461342 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-registry-certificates\") pod \"58995455-5f53-49bb-84e7-dab094ffec5b\" (UID: \"58995455-5f53-49bb-84e7-dab094ffec5b\") " Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461504 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpmmp\" (UniqueName: 
\"kubernetes.io/projected/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-kube-api-access-gpmmp\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461545 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-catalog-content\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461568 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-utilities\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.461869 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "58995455-5f53-49bb-84e7-dab094ffec5b" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.462001 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-utilities\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.463142 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "58995455-5f53-49bb-84e7-dab094ffec5b" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.463433 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-catalog-content\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.466051 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "58995455-5f53-49bb-84e7-dab094ffec5b" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.466284 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-kube-api-access-2cfx5" (OuterVolumeSpecName: "kube-api-access-2cfx5") pod "58995455-5f53-49bb-84e7-dab094ffec5b" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b"). InnerVolumeSpecName "kube-api-access-2cfx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.469229 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "58995455-5f53-49bb-84e7-dab094ffec5b" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.465061 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58995455-5f53-49bb-84e7-dab094ffec5b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "58995455-5f53-49bb-84e7-dab094ffec5b" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.477733 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "58995455-5f53-49bb-84e7-dab094ffec5b" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.481254 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpmmp\" (UniqueName: \"kubernetes.io/projected/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-kube-api-access-gpmmp\") pod \"community-operators-67tqx\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.488542 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58995455-5f53-49bb-84e7-dab094ffec5b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "58995455-5f53-49bb-84e7-dab094ffec5b" (UID: "58995455-5f53-49bb-84e7-dab094ffec5b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.491548 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.569124 4834 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.569152 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cfx5\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-kube-api-access-2cfx5\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.569164 4834 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.569171 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58995455-5f53-49bb-84e7-dab094ffec5b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.569179 4834 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58995455-5f53-49bb-84e7-dab094ffec5b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.569188 4834 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58995455-5f53-49bb-84e7-dab094ffec5b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.569196 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58995455-5f53-49bb-84e7-dab094ffec5b-bound-sa-token\") on node \"crc\" DevicePath \"\"" 
Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.696467 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.857724 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2k5hc" event={"ID":"ae12279c-fd14-40dd-a3bd-c3d5ceaac331","Type":"ContainerStarted","Data":"eb7fca1df92dabf94fb18fa6d51ee6520ee347b8eaa13e3b6bb69aa92550281f"} Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.859626 4834 generic.go:334] "Generic (PLEG): container finished" podID="58995455-5f53-49bb-84e7-dab094ffec5b" containerID="8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b" exitCode=0 Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.859701 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.859996 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" event={"ID":"58995455-5f53-49bb-84e7-dab094ffec5b","Type":"ContainerDied","Data":"8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b"} Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.860052 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" event={"ID":"58995455-5f53-49bb-84e7-dab094ffec5b","Type":"ContainerDied","Data":"5375b2fb94667eb26ba79cd698522217b7d93328e7f1941aebb3cb964a060863"} Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.860077 4834 scope.go:117] "RemoveContainer" containerID="8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.860937 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pfvpm" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.876512 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2k5hc" podStartSLOduration=2.159590091 podStartE2EDuration="3.876489279s" podCreationTimestamp="2026-01-30 21:21:52 +0000 UTC" firstStartedPulling="2026-01-30 21:21:53.820125554 +0000 UTC m=+364.973271692" lastFinishedPulling="2026-01-30 21:21:55.537024742 +0000 UTC m=+366.690170880" observedRunningTime="2026-01-30 21:21:55.874383475 +0000 UTC m=+367.027529613" watchObservedRunningTime="2026-01-30 21:21:55.876489279 +0000 UTC m=+367.029635427" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.881856 4834 scope.go:117] "RemoveContainer" containerID="8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b" Jan 30 21:21:55 crc kubenswrapper[4834]: E0130 21:21:55.882210 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b\": container with ID starting with 8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b not found: ID does not exist" containerID="8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b" Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.882235 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b"} err="failed to get container status \"8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b\": rpc error: code = NotFound desc = could not find container \"8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b\": container with ID starting with 8576eb8e271352c24a0690bb9e065edf914b1c4a6dba6a47676e8bf69c99b79b not found: ID does not exist" Jan 30 21:21:55 crc kubenswrapper[4834]: 
I0130 21:21:55.896647 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n"] Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.901614 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb989677-qss4n"] Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.914490 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pfvpm"] Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.917598 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pfvpm"] Jan 30 21:21:55 crc kubenswrapper[4834]: W0130 21:21:55.919756 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1421282d_913d_46ae_b270_fdde46df72a3.slice/crio-a1f6021aa7635ea2902feae3de727d0deb6c8f2e0c87ef5f2d4ce9c167550fa2 WatchSource:0}: Error finding container a1f6021aa7635ea2902feae3de727d0deb6c8f2e0c87ef5f2d4ce9c167550fa2: Status 404 returned error can't find the container with id a1f6021aa7635ea2902feae3de727d0deb6c8f2e0c87ef5f2d4ce9c167550fa2 Jan 30 21:21:55 crc kubenswrapper[4834]: I0130 21:21:55.924621 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmwt6"] Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.003705 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l"] Jan 30 21:21:56 crc kubenswrapper[4834]: E0130 21:21:56.004218 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58995455-5f53-49bb-84e7-dab094ffec5b" containerName="registry" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.004238 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="58995455-5f53-49bb-84e7-dab094ffec5b" 
containerName="registry" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.004356 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="58995455-5f53-49bb-84e7-dab094ffec5b" containerName="registry" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.004852 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.007842 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.008192 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.008366 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.009359 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.009573 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.010318 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l"] Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.011459 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.077160 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-serving-cert\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.077206 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrvlb\" (UniqueName: \"kubernetes.io/projected/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-kube-api-access-wrvlb\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.077239 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-config\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.077431 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-client-ca\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.101759 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-67tqx"] Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.178383 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-serving-cert\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.178439 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrvlb\" (UniqueName: \"kubernetes.io/projected/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-kube-api-access-wrvlb\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.178472 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-config\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.178515 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-client-ca\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.179312 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-client-ca\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 
21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.180083 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-config\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.183727 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-serving-cert\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.200557 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrvlb\" (UniqueName: \"kubernetes.io/projected/c5a78e2e-53ce-42e4-85d8-7cbdea9cd460-kube-api-access-wrvlb\") pod \"route-controller-manager-787dd497d8-vbw4l\" (UID: \"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460\") " pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.318809 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.702503 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l"] Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.866019 4834 generic.go:334] "Generic (PLEG): container finished" podID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerID="f7c0747ae9e96fba5c6e0704bde24a12243ada90ff27102130762b2e85b28fcd" exitCode=0 Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.866094 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67tqx" event={"ID":"1a1bc0d2-5126-41ab-9d64-cfbd3707572f","Type":"ContainerDied","Data":"f7c0747ae9e96fba5c6e0704bde24a12243ada90ff27102130762b2e85b28fcd"} Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.866163 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67tqx" event={"ID":"1a1bc0d2-5126-41ab-9d64-cfbd3707572f","Type":"ContainerStarted","Data":"14dee4a39ec89321c4b7d256fcf26458a8a13cbd2147ac93ab33076e3a8c87c7"} Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.868074 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" event={"ID":"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460","Type":"ContainerStarted","Data":"9e4c4f18ec16e2e1ddd71c04b19e006f754ef69750a7c26ca1f49c2af39745d0"} Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.868119 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" event={"ID":"c5a78e2e-53ce-42e4-85d8-7cbdea9cd460","Type":"ContainerStarted","Data":"9a7ac07d8f1240c463722a4ad1b723ba1133653c6372fb7553b4e0d31f20a4e4"} Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.869097 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.871234 4834 patch_prober.go:28] interesting pod/route-controller-manager-787dd497d8-vbw4l container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.871286 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" podUID="c5a78e2e-53ce-42e4-85d8-7cbdea9cd460" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.875208 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7dp9" event={"ID":"e8c12556-80ec-42b1-9d47-ead9224f86ff","Type":"ContainerStarted","Data":"47889f83fb0fa45d25fafd9fe13772b73c6d162507167ddf7873157cc0dc0a9e"} Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.877429 4834 generic.go:334] "Generic (PLEG): container finished" podID="1421282d-913d-46ae-b270-fdde46df72a3" containerID="f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9" exitCode=0 Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.877805 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmwt6" event={"ID":"1421282d-913d-46ae-b270-fdde46df72a3","Type":"ContainerDied","Data":"f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9"} Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.877842 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-xmwt6" event={"ID":"1421282d-913d-46ae-b270-fdde46df72a3","Type":"ContainerStarted","Data":"a1f6021aa7635ea2902feae3de727d0deb6c8f2e0c87ef5f2d4ce9c167550fa2"} Jan 30 21:21:56 crc kubenswrapper[4834]: I0130 21:21:56.907114 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" podStartSLOduration=2.907097327 podStartE2EDuration="2.907097327s" podCreationTimestamp="2026-01-30 21:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:21:56.901198146 +0000 UTC m=+368.054344304" watchObservedRunningTime="2026-01-30 21:21:56.907097327 +0000 UTC m=+368.060243455" Jan 30 21:21:57 crc kubenswrapper[4834]: I0130 21:21:57.537175 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b0729e4-75c4-4e4b-a440-fc685fc5958f" path="/var/lib/kubelet/pods/3b0729e4-75c4-4e4b-a440-fc685fc5958f/volumes" Jan 30 21:21:57 crc kubenswrapper[4834]: I0130 21:21:57.539054 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58995455-5f53-49bb-84e7-dab094ffec5b" path="/var/lib/kubelet/pods/58995455-5f53-49bb-84e7-dab094ffec5b/volumes" Jan 30 21:21:57 crc kubenswrapper[4834]: I0130 21:21:57.885366 4834 generic.go:334] "Generic (PLEG): container finished" podID="e8c12556-80ec-42b1-9d47-ead9224f86ff" containerID="47889f83fb0fa45d25fafd9fe13772b73c6d162507167ddf7873157cc0dc0a9e" exitCode=0 Jan 30 21:21:57 crc kubenswrapper[4834]: I0130 21:21:57.885440 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7dp9" event={"ID":"e8c12556-80ec-42b1-9d47-ead9224f86ff","Type":"ContainerDied","Data":"47889f83fb0fa45d25fafd9fe13772b73c6d162507167ddf7873157cc0dc0a9e"} Jan 30 21:21:57 crc kubenswrapper[4834]: I0130 21:21:57.887569 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-xmwt6" event={"ID":"1421282d-913d-46ae-b270-fdde46df72a3","Type":"ContainerStarted","Data":"39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c"} Jan 30 21:21:57 crc kubenswrapper[4834]: I0130 21:21:57.889232 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67tqx" event={"ID":"1a1bc0d2-5126-41ab-9d64-cfbd3707572f","Type":"ContainerStarted","Data":"a7270e9653808a9ff69cbbf224e0e31cc1d567ce4cc9ef35281292cf1d0d79a0"} Jan 30 21:21:57 crc kubenswrapper[4834]: I0130 21:21:57.894800 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-787dd497d8-vbw4l" Jan 30 21:21:58 crc kubenswrapper[4834]: I0130 21:21:58.895465 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r7dp9" event={"ID":"e8c12556-80ec-42b1-9d47-ead9224f86ff","Type":"ContainerStarted","Data":"3be73622ae01756f9dc2c69fd26453af44a843063d199630f3f20fd6f54d8371"} Jan 30 21:21:58 crc kubenswrapper[4834]: I0130 21:21:58.899026 4834 generic.go:334] "Generic (PLEG): container finished" podID="1421282d-913d-46ae-b270-fdde46df72a3" containerID="39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c" exitCode=0 Jan 30 21:21:58 crc kubenswrapper[4834]: I0130 21:21:58.899093 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmwt6" event={"ID":"1421282d-913d-46ae-b270-fdde46df72a3","Type":"ContainerDied","Data":"39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c"} Jan 30 21:21:58 crc kubenswrapper[4834]: I0130 21:21:58.901792 4834 generic.go:334] "Generic (PLEG): container finished" podID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerID="a7270e9653808a9ff69cbbf224e0e31cc1d567ce4cc9ef35281292cf1d0d79a0" exitCode=0 Jan 30 21:21:58 crc kubenswrapper[4834]: I0130 21:21:58.901872 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67tqx" event={"ID":"1a1bc0d2-5126-41ab-9d64-cfbd3707572f","Type":"ContainerDied","Data":"a7270e9653808a9ff69cbbf224e0e31cc1d567ce4cc9ef35281292cf1d0d79a0"} Jan 30 21:21:58 crc kubenswrapper[4834]: I0130 21:21:58.923711 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r7dp9" podStartSLOduration=3.411975864 podStartE2EDuration="6.923694819s" podCreationTimestamp="2026-01-30 21:21:52 +0000 UTC" firstStartedPulling="2026-01-30 21:21:54.827183046 +0000 UTC m=+365.980329184" lastFinishedPulling="2026-01-30 21:21:58.338902001 +0000 UTC m=+369.492048139" observedRunningTime="2026-01-30 21:21:58.922578074 +0000 UTC m=+370.075724212" watchObservedRunningTime="2026-01-30 21:21:58.923694819 +0000 UTC m=+370.076840957" Jan 30 21:21:59 crc kubenswrapper[4834]: I0130 21:21:59.908060 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmwt6" event={"ID":"1421282d-913d-46ae-b270-fdde46df72a3","Type":"ContainerStarted","Data":"244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce"} Jan 30 21:21:59 crc kubenswrapper[4834]: I0130 21:21:59.909763 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67tqx" event={"ID":"1a1bc0d2-5126-41ab-9d64-cfbd3707572f","Type":"ContainerStarted","Data":"bb40727cb8deff0d53a0a84d2aeaef020b5346724a6534ddcd5889d1290fe429"} Jan 30 21:21:59 crc kubenswrapper[4834]: I0130 21:21:59.927595 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xmwt6" podStartSLOduration=2.507197665 podStartE2EDuration="4.927582324s" podCreationTimestamp="2026-01-30 21:21:55 +0000 UTC" firstStartedPulling="2026-01-30 21:21:56.879062154 +0000 UTC m=+368.032208292" lastFinishedPulling="2026-01-30 21:21:59.299446823 +0000 UTC m=+370.452592951" 
observedRunningTime="2026-01-30 21:21:59.926336486 +0000 UTC m=+371.079482624" watchObservedRunningTime="2026-01-30 21:21:59.927582324 +0000 UTC m=+371.080728462" Jan 30 21:21:59 crc kubenswrapper[4834]: I0130 21:21:59.954126 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-67tqx" podStartSLOduration=2.4684267220000002 podStartE2EDuration="4.95411128s" podCreationTimestamp="2026-01-30 21:21:55 +0000 UTC" firstStartedPulling="2026-01-30 21:21:56.867456477 +0000 UTC m=+368.020602635" lastFinishedPulling="2026-01-30 21:21:59.353141055 +0000 UTC m=+370.506287193" observedRunningTime="2026-01-30 21:21:59.950581792 +0000 UTC m=+371.103727930" watchObservedRunningTime="2026-01-30 21:21:59.95411128 +0000 UTC m=+371.107257418" Jan 30 21:22:03 crc kubenswrapper[4834]: I0130 21:22:03.065574 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:22:03 crc kubenswrapper[4834]: I0130 21:22:03.065884 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:22:03 crc kubenswrapper[4834]: I0130 21:22:03.145879 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:22:03 crc kubenswrapper[4834]: I0130 21:22:03.258270 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:22:03 crc kubenswrapper[4834]: I0130 21:22:03.258348 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:22:03 crc kubenswrapper[4834]: I0130 21:22:03.974192 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2k5hc" Jan 30 21:22:04 crc kubenswrapper[4834]: I0130 21:22:04.161251 4834 
patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:22:04 crc kubenswrapper[4834]: I0130 21:22:04.161880 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:22:04 crc kubenswrapper[4834]: I0130 21:22:04.325078 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r7dp9" podUID="e8c12556-80ec-42b1-9d47-ead9224f86ff" containerName="registry-server" probeResult="failure" output=< Jan 30 21:22:04 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:22:04 crc kubenswrapper[4834]: > Jan 30 21:22:05 crc kubenswrapper[4834]: I0130 21:22:05.492757 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:22:05 crc kubenswrapper[4834]: I0130 21:22:05.493291 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:22:05 crc kubenswrapper[4834]: I0130 21:22:05.550309 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:22:05 crc kubenswrapper[4834]: I0130 21:22:05.697951 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:22:05 crc kubenswrapper[4834]: I0130 21:22:05.698013 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:22:05 crc kubenswrapper[4834]: I0130 21:22:05.757265 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:22:05 crc kubenswrapper[4834]: I0130 21:22:05.975609 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:22:05 crc kubenswrapper[4834]: I0130 21:22:05.976715 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:22:13 crc kubenswrapper[4834]: I0130 21:22:13.322124 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:22:13 crc kubenswrapper[4834]: I0130 21:22:13.390236 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r7dp9" Jan 30 21:22:34 crc kubenswrapper[4834]: I0130 21:22:34.161505 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:22:34 crc kubenswrapper[4834]: I0130 21:22:34.162112 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:22:34 crc kubenswrapper[4834]: I0130 21:22:34.162181 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:22:34 crc kubenswrapper[4834]: 
I0130 21:22:34.163118 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42c330620a3e86a82c2bb84c857d3ae702f97694fd939fb37ee985cfe42ce65b"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:22:34 crc kubenswrapper[4834]: I0130 21:22:34.163222 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://42c330620a3e86a82c2bb84c857d3ae702f97694fd939fb37ee985cfe42ce65b" gracePeriod=600 Jan 30 21:22:35 crc kubenswrapper[4834]: I0130 21:22:35.155594 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="42c330620a3e86a82c2bb84c857d3ae702f97694fd939fb37ee985cfe42ce65b" exitCode=0 Jan 30 21:22:35 crc kubenswrapper[4834]: I0130 21:22:35.155735 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"42c330620a3e86a82c2bb84c857d3ae702f97694fd939fb37ee985cfe42ce65b"} Jan 30 21:22:35 crc kubenswrapper[4834]: I0130 21:22:35.156040 4834 scope.go:117] "RemoveContainer" containerID="fb411fc22b97adae64e88403f5c3ceb778843857dd38d1c2d8767aada368c243" Jan 30 21:22:36 crc kubenswrapper[4834]: I0130 21:22:36.168974 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"8ec49418611dac2916cceb2c6e5d860a3177886a0b42237393b287ec87bc697b"} Jan 30 21:25:04 crc kubenswrapper[4834]: I0130 21:25:04.161812 4834 patch_prober.go:28] interesting 
pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:25:04 crc kubenswrapper[4834]: I0130 21:25:04.162636 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:25:34 crc kubenswrapper[4834]: I0130 21:25:34.160888 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:25:34 crc kubenswrapper[4834]: I0130 21:25:34.162099 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.161928 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.162769 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.162835 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.163645 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ec49418611dac2916cceb2c6e5d860a3177886a0b42237393b287ec87bc697b"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.163742 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://8ec49418611dac2916cceb2c6e5d860a3177886a0b42237393b287ec87bc697b" gracePeriod=600 Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.595263 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="8ec49418611dac2916cceb2c6e5d860a3177886a0b42237393b287ec87bc697b" exitCode=0 Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.595333 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"8ec49418611dac2916cceb2c6e5d860a3177886a0b42237393b287ec87bc697b"} Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.595688 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" 
event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"76c0ed7f9e9f321f65e1f6d65b4089c7729795a844b8db9bc32d4d6eeeb8f6b8"} Jan 30 21:26:04 crc kubenswrapper[4834]: I0130 21:26:04.595713 4834 scope.go:117] "RemoveContainer" containerID="42c330620a3e86a82c2bb84c857d3ae702f97694fd939fb37ee985cfe42ce65b" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.111292 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg"] Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.112655 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.114555 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-89j5t" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.114588 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.115519 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.121524 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg"] Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.127256 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-vsrc9"] Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.127977 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-vsrc9" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.129209 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-shxfw" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.133923 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-z5xqc"] Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.134482 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.136363 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nw4wd" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.144590 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-vsrc9"] Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.157601 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-z5xqc"] Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.195534 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcg5h\" (UniqueName: \"kubernetes.io/projected/496f8c34-1261-4c09-9e7e-fa69c23cca44-kube-api-access-fcg5h\") pod \"cert-manager-858654f9db-vsrc9\" (UID: \"496f8c34-1261-4c09-9e7e-fa69c23cca44\") " pod="cert-manager/cert-manager-858654f9db-vsrc9" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.195602 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-856mt\" (UniqueName: \"kubernetes.io/projected/9d83df3b-5c19-4bed-9a40-06f23afde5a9-kube-api-access-856mt\") pod \"cert-manager-cainjector-cf98fcc89-b4mhg\" (UID: \"9d83df3b-5c19-4bed-9a40-06f23afde5a9\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.195639 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpwfs\" (UniqueName: \"kubernetes.io/projected/1a3c27a9-b6a9-4971-b250-6b34d6528e5a-kube-api-access-bpwfs\") pod \"cert-manager-webhook-687f57d79b-z5xqc\" (UID: \"1a3c27a9-b6a9-4971-b250-6b34d6528e5a\") " pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.295930 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-856mt\" (UniqueName: \"kubernetes.io/projected/9d83df3b-5c19-4bed-9a40-06f23afde5a9-kube-api-access-856mt\") pod \"cert-manager-cainjector-cf98fcc89-b4mhg\" (UID: \"9d83df3b-5c19-4bed-9a40-06f23afde5a9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.296186 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpwfs\" (UniqueName: \"kubernetes.io/projected/1a3c27a9-b6a9-4971-b250-6b34d6528e5a-kube-api-access-bpwfs\") pod \"cert-manager-webhook-687f57d79b-z5xqc\" (UID: \"1a3c27a9-b6a9-4971-b250-6b34d6528e5a\") " pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.296236 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcg5h\" (UniqueName: \"kubernetes.io/projected/496f8c34-1261-4c09-9e7e-fa69c23cca44-kube-api-access-fcg5h\") pod \"cert-manager-858654f9db-vsrc9\" (UID: \"496f8c34-1261-4c09-9e7e-fa69c23cca44\") " pod="cert-manager/cert-manager-858654f9db-vsrc9" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.313950 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpwfs\" (UniqueName: 
\"kubernetes.io/projected/1a3c27a9-b6a9-4971-b250-6b34d6528e5a-kube-api-access-bpwfs\") pod \"cert-manager-webhook-687f57d79b-z5xqc\" (UID: \"1a3c27a9-b6a9-4971-b250-6b34d6528e5a\") " pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.314116 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-856mt\" (UniqueName: \"kubernetes.io/projected/9d83df3b-5c19-4bed-9a40-06f23afde5a9-kube-api-access-856mt\") pod \"cert-manager-cainjector-cf98fcc89-b4mhg\" (UID: \"9d83df3b-5c19-4bed-9a40-06f23afde5a9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.314341 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcg5h\" (UniqueName: \"kubernetes.io/projected/496f8c34-1261-4c09-9e7e-fa69c23cca44-kube-api-access-fcg5h\") pod \"cert-manager-858654f9db-vsrc9\" (UID: \"496f8c34-1261-4c09-9e7e-fa69c23cca44\") " pod="cert-manager/cert-manager-858654f9db-vsrc9" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.429288 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.438866 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-vsrc9" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.449518 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.874901 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg"] Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.879138 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-vsrc9"] Jan 30 21:27:24 crc kubenswrapper[4834]: W0130 21:27:24.882431 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod496f8c34_1261_4c09_9e7e_fa69c23cca44.slice/crio-1580827c4e6562fe401ff90f02a49cf4f80d0199dce7cd43dce0c5ee3c539661 WatchSource:0}: Error finding container 1580827c4e6562fe401ff90f02a49cf4f80d0199dce7cd43dce0c5ee3c539661: Status 404 returned error can't find the container with id 1580827c4e6562fe401ff90f02a49cf4f80d0199dce7cd43dce0c5ee3c539661 Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.882938 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:27:24 crc kubenswrapper[4834]: I0130 21:27:24.945992 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-z5xqc"] Jan 30 21:27:24 crc kubenswrapper[4834]: W0130 21:27:24.955172 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3c27a9_b6a9_4971_b250_6b34d6528e5a.slice/crio-de3ab2896a513ffaf05744fe679e0da45e84e82ac6ad08f666a715e7a4480be9 WatchSource:0}: Error finding container de3ab2896a513ffaf05744fe679e0da45e84e82ac6ad08f666a715e7a4480be9: Status 404 returned error can't find the container with id de3ab2896a513ffaf05744fe679e0da45e84e82ac6ad08f666a715e7a4480be9 Jan 30 21:27:25 crc kubenswrapper[4834]: I0130 21:27:25.139117 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg" event={"ID":"9d83df3b-5c19-4bed-9a40-06f23afde5a9","Type":"ContainerStarted","Data":"4184187e4131b2fa2e4e7918572fd13850c17d1fa742843cc523e545a4a0d0df"} Jan 30 21:27:25 crc kubenswrapper[4834]: I0130 21:27:25.140607 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" event={"ID":"1a3c27a9-b6a9-4971-b250-6b34d6528e5a","Type":"ContainerStarted","Data":"de3ab2896a513ffaf05744fe679e0da45e84e82ac6ad08f666a715e7a4480be9"} Jan 30 21:27:25 crc kubenswrapper[4834]: I0130 21:27:25.142134 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-vsrc9" event={"ID":"496f8c34-1261-4c09-9e7e-fa69c23cca44","Type":"ContainerStarted","Data":"1580827c4e6562fe401ff90f02a49cf4f80d0199dce7cd43dce0c5ee3c539661"} Jan 30 21:27:29 crc kubenswrapper[4834]: I0130 21:27:29.169549 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg" event={"ID":"9d83df3b-5c19-4bed-9a40-06f23afde5a9","Type":"ContainerStarted","Data":"e9c332643366d42241eec85f839b106b072ae0f112fbce9e133d0b956bc6dd3e"} Jan 30 21:27:29 crc kubenswrapper[4834]: I0130 21:27:29.171767 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" event={"ID":"1a3c27a9-b6a9-4971-b250-6b34d6528e5a","Type":"ContainerStarted","Data":"35f3624e1a1be74bb078d399d2849cb62eedc0b87c44800b9912d328d6100c46"} Jan 30 21:27:29 crc kubenswrapper[4834]: I0130 21:27:29.172048 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" Jan 30 21:27:29 crc kubenswrapper[4834]: I0130 21:27:29.174526 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-vsrc9" 
event={"ID":"496f8c34-1261-4c09-9e7e-fa69c23cca44","Type":"ContainerStarted","Data":"4b169c7d371da4b5a50d1f7dc1f4aa9ead8d2951094a709f4201e4f0b7fff6c0"} Jan 30 21:27:29 crc kubenswrapper[4834]: I0130 21:27:29.199542 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mhg" podStartSLOduration=1.70256389 podStartE2EDuration="5.19951117s" podCreationTimestamp="2026-01-30 21:27:24 +0000 UTC" firstStartedPulling="2026-01-30 21:27:24.882714745 +0000 UTC m=+696.035860893" lastFinishedPulling="2026-01-30 21:27:28.379662035 +0000 UTC m=+699.532808173" observedRunningTime="2026-01-30 21:27:29.192027879 +0000 UTC m=+700.345174047" watchObservedRunningTime="2026-01-30 21:27:29.19951117 +0000 UTC m=+700.352657348" Jan 30 21:27:29 crc kubenswrapper[4834]: I0130 21:27:29.241234 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-vsrc9" podStartSLOduration=1.7448926359999999 podStartE2EDuration="5.24121862s" podCreationTimestamp="2026-01-30 21:27:24 +0000 UTC" firstStartedPulling="2026-01-30 21:27:24.884262353 +0000 UTC m=+696.037408501" lastFinishedPulling="2026-01-30 21:27:28.380588347 +0000 UTC m=+699.533734485" observedRunningTime="2026-01-30 21:27:29.239179911 +0000 UTC m=+700.392326059" watchObservedRunningTime="2026-01-30 21:27:29.24121862 +0000 UTC m=+700.394364768" Jan 30 21:27:29 crc kubenswrapper[4834]: I0130 21:27:29.273618 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" podStartSLOduration=1.727393943 podStartE2EDuration="5.273590064s" podCreationTimestamp="2026-01-30 21:27:24 +0000 UTC" firstStartedPulling="2026-01-30 21:27:24.95807717 +0000 UTC m=+696.111223318" lastFinishedPulling="2026-01-30 21:27:28.504273301 +0000 UTC m=+699.657419439" observedRunningTime="2026-01-30 21:27:29.265785875 +0000 UTC m=+700.418932033" 
watchObservedRunningTime="2026-01-30 21:27:29.273590064 +0000 UTC m=+700.426736242" Jan 30 21:27:34 crc kubenswrapper[4834]: I0130 21:27:34.452697 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-z5xqc" Jan 30 21:27:40 crc kubenswrapper[4834]: I0130 21:27:40.992974 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xmxm"] Jan 30 21:27:40 crc kubenswrapper[4834]: I0130 21:27:40.994198 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovn-controller" containerID="cri-o://ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1" gracePeriod=30 Jan 30 21:27:40 crc kubenswrapper[4834]: I0130 21:27:40.994352 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27" gracePeriod=30 Jan 30 21:27:40 crc kubenswrapper[4834]: I0130 21:27:40.994321 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="sbdb" containerID="cri-o://bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" gracePeriod=30 Jan 30 21:27:40 crc kubenswrapper[4834]: I0130 21:27:40.994455 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovn-acl-logging" containerID="cri-o://55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2" gracePeriod=30 Jan 30 21:27:40 crc kubenswrapper[4834]: I0130 21:27:40.994423 4834 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="northd" containerID="cri-o://f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a" gracePeriod=30 Jan 30 21:27:40 crc kubenswrapper[4834]: I0130 21:27:40.994467 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kube-rbac-proxy-node" containerID="cri-o://3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5" gracePeriod=30 Jan 30 21:27:40 crc kubenswrapper[4834]: I0130 21:27:40.994264 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="nbdb" containerID="cri-o://2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" gracePeriod=30 Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.061239 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" containerID="cri-o://25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094" gracePeriod=30 Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.087886 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.089533 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.089639 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.091147 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.091387 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.091451 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="sbdb" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.092426 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.092472 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="nbdb" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.269474 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/2.log" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.270334 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/1.log" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.270441 4834 generic.go:334] "Generic (PLEG): container finished" podID="25f6f1cd-cd4b-475a-85a3-4e81cda5d203" containerID="1d1bb595b13953c5708f441831dee60e73b95ed8a4ad7deae34f1eb003a5eb62" exitCode=2 Jan 30 21:27:41 crc 
kubenswrapper[4834]: I0130 21:27:41.270528 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5655f" event={"ID":"25f6f1cd-cd4b-475a-85a3-4e81cda5d203","Type":"ContainerDied","Data":"1d1bb595b13953c5708f441831dee60e73b95ed8a4ad7deae34f1eb003a5eb62"} Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.270587 4834 scope.go:117] "RemoveContainer" containerID="280dbea89a1ce891a4af9a326c75a34f13283acce5e635528e3207c0ee569349" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.272311 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/3.log" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.272490 4834 scope.go:117] "RemoveContainer" containerID="1d1bb595b13953c5708f441831dee60e73b95ed8a4ad7deae34f1eb003a5eb62" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.273021 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5655f_openshift-multus(25f6f1cd-cd4b-475a-85a3-4e81cda5d203)\"" pod="openshift-multus/multus-5655f" podUID="25f6f1cd-cd4b-475a-85a3-4e81cda5d203" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.274241 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovn-acl-logging/0.log" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.274776 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovn-controller/0.log" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275164 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094" exitCode=0 
Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275182 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" exitCode=0 Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275189 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27" exitCode=0 Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275196 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5" exitCode=0 Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275204 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2" exitCode=143 Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275210 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1" exitCode=143 Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275223 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094"} Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275238 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c"} Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275248 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27"} Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275256 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5"} Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275265 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2"} Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.275273 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1"} Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.368592 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovnkube-controller/3.log" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.371843 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovn-acl-logging/0.log" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.372337 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovn-controller/0.log" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.372738 4834 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.376671 4834 scope.go:117] "RemoveContainer" containerID="b16574877d18206c74eaeea49bf271a357749d6baaca54b271df611ea173fe7d" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.437953 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4ttfn"] Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438326 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438361 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438506 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438523 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438538 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438555 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438573 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="nbdb" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438585 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="nbdb" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438605 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="sbdb" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438616 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="sbdb" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438631 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438643 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438656 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kubecfg-setup" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438668 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kubecfg-setup" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438681 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovn-acl-logging" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438694 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovn-acl-logging" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438720 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kube-rbac-proxy-node" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438732 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kube-rbac-proxy-node" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438747 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438760 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438781 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovn-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438793 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovn-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.438811 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="northd" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438824 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="northd" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.438995 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439011 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovn-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439028 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="northd" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439044 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kube-rbac-proxy-ovn-metrics" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439062 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="kube-rbac-proxy-node" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439079 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439095 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439116 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="sbdb" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439132 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439145 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="nbdb" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439163 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovn-acl-logging" Jan 30 21:27:41 crc kubenswrapper[4834]: E0130 21:27:41.439332 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439347 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.439558 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1133254b-8923-414d-8031-4dfe81f17e12" containerName="ovnkube-controller" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.442594 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500512 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-systemd-units\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500575 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-ovn-kubernetes\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500618 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-config\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500649 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-ovn\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500666 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-netns\") pod 
\"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500693 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-script-lib\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500732 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-node-log\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500750 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500757 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-env-overrides\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500818 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-slash\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500851 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1133254b-8923-414d-8031-4dfe81f17e12-ovn-node-metrics-cert\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500850 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500878 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500963 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-systemd\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500895 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501003 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-bin\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500904 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500923 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-slash" (OuterVolumeSpecName: "host-slash") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.500885 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-node-log" (OuterVolumeSpecName: "node-log") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501040 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-openvswitch\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501047 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501062 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501074 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-var-lib-openvswitch\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501093 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501098 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501125 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-kubelet\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501156 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-etc-openvswitch\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501183 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501181 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501200 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkdbm\" (UniqueName: \"kubernetes.io/projected/1133254b-8923-414d-8031-4dfe81f17e12-kube-api-access-qkdbm\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501221 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501249 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-log-socket\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501278 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-netd\") pod \"1133254b-8923-414d-8031-4dfe81f17e12\" (UID: \"1133254b-8923-414d-8031-4dfe81f17e12\") " Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501255 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501304 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-log-socket" (OuterVolumeSpecName: "log-socket") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501329 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501416 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501574 4834 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501589 4834 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501599 4834 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501608 4834 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501616 4834 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501623 4834 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501631 4834 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-log-socket\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 
crc kubenswrapper[4834]: I0130 21:27:41.501656 4834 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501664 4834 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501673 4834 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501680 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501688 4834 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501696 4834 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501704 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501713 4834 reconciler_common.go:293] "Volume detached for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-node-log\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501736 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1133254b-8923-414d-8031-4dfe81f17e12-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.501744 4834 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-host-slash\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.506024 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1133254b-8923-414d-8031-4dfe81f17e12-kube-api-access-qkdbm" (OuterVolumeSpecName: "kube-api-access-qkdbm") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "kube-api-access-qkdbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.507603 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1133254b-8923-414d-8031-4dfe81f17e12-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.521766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1133254b-8923-414d-8031-4dfe81f17e12" (UID: "1133254b-8923-414d-8031-4dfe81f17e12"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.602831 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-var-lib-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.602897 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-systemd-units\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.602933 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-cni-netd\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.602958 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-env-overrides\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603030 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6msxq\" (UniqueName: \"kubernetes.io/projected/f3101e8b-3f74-4441-958c-986666910830-kube-api-access-6msxq\") pod \"ovnkube-node-4ttfn\" (UID: 
\"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603059 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-cni-bin\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603085 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-node-log\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603114 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-systemd\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603140 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-etc-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603267 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3101e8b-3f74-4441-958c-986666910830-ovn-node-metrics-cert\") pod \"ovnkube-node-4ttfn\" 
(UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603301 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-ovnkube-config\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603320 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-slash\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603347 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-run-netns\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603405 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603427 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-kubelet\") pod 
\"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603455 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-ovnkube-script-lib\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603473 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603488 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-ovn\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603575 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603624 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-log-socket\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603752 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1133254b-8923-414d-8031-4dfe81f17e12-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603774 4834 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1133254b-8923-414d-8031-4dfe81f17e12-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.603792 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkdbm\" (UniqueName: \"kubernetes.io/projected/1133254b-8923-414d-8031-4dfe81f17e12-kube-api-access-qkdbm\") on node \"crc\" DevicePath \"\"" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705379 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-kubelet\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705474 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705514 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705518 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-kubelet\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705540 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-ovnkube-script-lib\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705570 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-ovn\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705605 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705640 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-log-socket\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705642 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-run-ovn-kubernetes\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705605 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705700 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-ovn\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705754 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705758 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-log-socket\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705796 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-var-lib-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705838 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-systemd-units\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705876 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-cni-netd\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705903 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-var-lib-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705911 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-env-overrides\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705959 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6msxq\" (UniqueName: \"kubernetes.io/projected/f3101e8b-3f74-4441-958c-986666910830-kube-api-access-6msxq\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705993 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-cni-bin\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706012 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-cni-netd\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706024 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-node-log\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.705959 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-systemd-units\") pod 
\"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706078 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-node-log\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706086 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-systemd\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706124 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-etc-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706134 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-cni-bin\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706155 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3101e8b-3f74-4441-958c-986666910830-ovn-node-metrics-cert\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706179 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-ovnkube-config\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706181 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-run-systemd\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706202 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-slash\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706228 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-run-netns\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706337 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-run-netns\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 
21:27:41.706498 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-host-slash\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706563 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3101e8b-3f74-4441-958c-986666910830-etc-openvswitch\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706757 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-env-overrides\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.706887 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-ovnkube-script-lib\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.707047 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3101e8b-3f74-4441-958c-986666910830-ovnkube-config\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.710952 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/f3101e8b-3f74-4441-958c-986666910830-ovn-node-metrics-cert\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.736729 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6msxq\" (UniqueName: \"kubernetes.io/projected/f3101e8b-3f74-4441-958c-986666910830-kube-api-access-6msxq\") pod \"ovnkube-node-4ttfn\" (UID: \"f3101e8b-3f74-4441-958c-986666910830\") " pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:41 crc kubenswrapper[4834]: I0130 21:27:41.762600 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.285951 4834 generic.go:334] "Generic (PLEG): container finished" podID="f3101e8b-3f74-4441-958c-986666910830" containerID="7b7f618f36c120181c80a0a0e40da46d8139926c6d1aab6a90997bb59dfa3f89" exitCode=0 Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.286112 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerDied","Data":"7b7f618f36c120181c80a0a0e40da46d8139926c6d1aab6a90997bb59dfa3f89"} Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.286687 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"ebf52dd7841496e24b8fc81ebfb8c0f8769843b6be0f564a02567ad418490462"} Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.295078 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovn-acl-logging/0.log" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.296303 4834 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xmxm_1133254b-8923-414d-8031-4dfe81f17e12/ovn-controller/0.log" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.297189 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" exitCode=0 Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.297243 4834 generic.go:334] "Generic (PLEG): container finished" podID="1133254b-8923-414d-8031-4dfe81f17e12" containerID="f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a" exitCode=0 Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.297274 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.297363 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8"} Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.297499 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a"} Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.297538 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xmxm" event={"ID":"1133254b-8923-414d-8031-4dfe81f17e12","Type":"ContainerDied","Data":"c955e224eef7499b97cb949ce5bdc2b397a08fdd06657708bd68499e884b1908"} Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.297575 4834 scope.go:117] "RemoveContainer" containerID="25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094" 
Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.303334 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/2.log" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.324754 4834 scope.go:117] "RemoveContainer" containerID="bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.368908 4834 scope.go:117] "RemoveContainer" containerID="2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.390881 4834 scope.go:117] "RemoveContainer" containerID="f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.398913 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xmxm"] Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.405008 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4xmxm"] Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.412824 4834 scope.go:117] "RemoveContainer" containerID="b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.426300 4834 scope.go:117] "RemoveContainer" containerID="3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.439885 4834 scope.go:117] "RemoveContainer" containerID="55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.450741 4834 scope.go:117] "RemoveContainer" containerID="ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.466997 4834 scope.go:117] "RemoveContainer" containerID="e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3" Jan 30 21:27:42 
crc kubenswrapper[4834]: I0130 21:27:42.483487 4834 scope.go:117] "RemoveContainer" containerID="25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.483926 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094\": container with ID starting with 25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094 not found: ID does not exist" containerID="25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.483975 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094"} err="failed to get container status \"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094\": rpc error: code = NotFound desc = could not find container \"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094\": container with ID starting with 25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.483996 4834 scope.go:117] "RemoveContainer" containerID="bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.484296 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\": container with ID starting with bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c not found: ID does not exist" containerID="bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.484328 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c"} err="failed to get container status \"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\": rpc error: code = NotFound desc = could not find container \"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\": container with ID starting with bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.484349 4834 scope.go:117] "RemoveContainer" containerID="2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.484823 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\": container with ID starting with 2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8 not found: ID does not exist" containerID="2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.484918 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8"} err="failed to get container status \"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\": rpc error: code = NotFound desc = could not find container \"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\": container with ID starting with 2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.484994 4834 scope.go:117] "RemoveContainer" containerID="f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.485319 4834 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\": container with ID starting with f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a not found: ID does not exist" containerID="f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.485351 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a"} err="failed to get container status \"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\": rpc error: code = NotFound desc = could not find container \"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\": container with ID starting with f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.485375 4834 scope.go:117] "RemoveContainer" containerID="b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.485728 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\": container with ID starting with b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27 not found: ID does not exist" containerID="b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.485800 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27"} err="failed to get container status \"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\": rpc error: code = NotFound desc = could not find container 
\"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\": container with ID starting with b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.485893 4834 scope.go:117] "RemoveContainer" containerID="3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.486279 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\": container with ID starting with 3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5 not found: ID does not exist" containerID="3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.486353 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5"} err="failed to get container status \"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\": rpc error: code = NotFound desc = could not find container \"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\": container with ID starting with 3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.486440 4834 scope.go:117] "RemoveContainer" containerID="55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.486786 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\": container with ID starting with 55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2 not found: ID does not exist" 
containerID="55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.486860 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2"} err="failed to get container status \"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\": rpc error: code = NotFound desc = could not find container \"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\": container with ID starting with 55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.486915 4834 scope.go:117] "RemoveContainer" containerID="ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.487190 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\": container with ID starting with ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1 not found: ID does not exist" containerID="ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.487225 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1"} err="failed to get container status \"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\": rpc error: code = NotFound desc = could not find container \"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\": container with ID starting with ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.487245 4834 scope.go:117] 
"RemoveContainer" containerID="e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3" Jan 30 21:27:42 crc kubenswrapper[4834]: E0130 21:27:42.487571 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\": container with ID starting with e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3 not found: ID does not exist" containerID="e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.487650 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3"} err="failed to get container status \"e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\": rpc error: code = NotFound desc = could not find container \"e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\": container with ID starting with e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.487714 4834 scope.go:117] "RemoveContainer" containerID="25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.488030 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094"} err="failed to get container status \"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094\": rpc error: code = NotFound desc = could not find container \"25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094\": container with ID starting with 25c7e047588b6fe90a2860626d990a41aafc7b99b96d9fca6e9a8b488d580094 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.488069 4834 
scope.go:117] "RemoveContainer" containerID="bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.488349 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c"} err="failed to get container status \"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\": rpc error: code = NotFound desc = could not find container \"bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c\": container with ID starting with bece562961910dcf51e6461ecf25eb7322ace02fba22767abf20655f2280781c not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.488447 4834 scope.go:117] "RemoveContainer" containerID="2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.488717 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8"} err="failed to get container status \"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\": rpc error: code = NotFound desc = could not find container \"2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8\": container with ID starting with 2fd565c2539fd241e594475e696858c67229ec0ec2efeed15150d7f123f1dce8 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.488787 4834 scope.go:117] "RemoveContainer" containerID="f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.489277 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a"} err="failed to get container status \"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\": rpc 
error: code = NotFound desc = could not find container \"f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a\": container with ID starting with f662fcbf0284837e6075a4bb894e994533b6f91883ed63fc92503eac935d895a not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.489301 4834 scope.go:117] "RemoveContainer" containerID="b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.489602 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27"} err="failed to get container status \"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\": rpc error: code = NotFound desc = could not find container \"b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27\": container with ID starting with b55b6e6b8104bfd0934778430e99a56817306b4ffc9d805275199cd16b4a3f27 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.489637 4834 scope.go:117] "RemoveContainer" containerID="3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.489935 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5"} err="failed to get container status \"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\": rpc error: code = NotFound desc = could not find container \"3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5\": container with ID starting with 3858cc3bec7e268c54e4e5ad7fcc96c809372310218174e392567a5d7ef049f5 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.489963 4834 scope.go:117] "RemoveContainer" containerID="55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2" Jan 30 21:27:42 crc 
kubenswrapper[4834]: I0130 21:27:42.490380 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2"} err="failed to get container status \"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\": rpc error: code = NotFound desc = could not find container \"55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2\": container with ID starting with 55deca7a97a6b21602e7fd368db0c77abb1e91fa3768aa55d8b6fdfdee310fa2 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.490473 4834 scope.go:117] "RemoveContainer" containerID="ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.490753 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1"} err="failed to get container status \"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\": rpc error: code = NotFound desc = could not find container \"ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1\": container with ID starting with ebf8d64589327a006a95a862007d446910aefba8dde6e2689056c5580b6c9ff1 not found: ID does not exist" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.490837 4834 scope.go:117] "RemoveContainer" containerID="e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3" Jan 30 21:27:42 crc kubenswrapper[4834]: I0130 21:27:42.491163 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3"} err="failed to get container status \"e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\": rpc error: code = NotFound desc = could not find container \"e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3\": container 
with ID starting with e7f754fd972786527a1f8c4311d8a85e39580958acce884c4b3d2643b2846ca3 not found: ID does not exist" Jan 30 21:27:43 crc kubenswrapper[4834]: I0130 21:27:43.321553 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"483fb84065723bb15a19ccb7557ac6668a4bee79b7fa9aa1b7d3a74ade6b8c98"} Jan 30 21:27:43 crc kubenswrapper[4834]: I0130 21:27:43.322102 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"ffccb1a85840fc9ae91b83030a30ab60c45b39c714828ce1dba0a630ec538f91"} Jan 30 21:27:43 crc kubenswrapper[4834]: I0130 21:27:43.322146 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"b5492f1467dc8f4224013cb0eaed258bab782d4cd2b3999eabd4971e22650c18"} Jan 30 21:27:43 crc kubenswrapper[4834]: I0130 21:27:43.322175 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"abcb0ed5fb90ea7aad3830229071cb6704ee6c0f595f7a1201304439f06ccab0"} Jan 30 21:27:43 crc kubenswrapper[4834]: I0130 21:27:43.322199 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"ab1c82ac61d97ede22f88baaa4f04f150c3876090e3735a7fc64b7fab1648e5f"} Jan 30 21:27:43 crc kubenswrapper[4834]: I0130 21:27:43.322232 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" 
event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"2fa6d5520b9f6f57ad60e289d7497d0de9860786a9ad1ba86177b302359b6099"} Jan 30 21:27:43 crc kubenswrapper[4834]: I0130 21:27:43.545990 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1133254b-8923-414d-8031-4dfe81f17e12" path="/var/lib/kubelet/pods/1133254b-8923-414d-8031-4dfe81f17e12/volumes" Jan 30 21:27:46 crc kubenswrapper[4834]: I0130 21:27:46.357068 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"1674497691bc5c8c3e1b3caf837e64f9fc97f5d4aad00969db1a328ddf92ab7e"} Jan 30 21:27:48 crc kubenswrapper[4834]: I0130 21:27:48.372473 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" event={"ID":"f3101e8b-3f74-4441-958c-986666910830","Type":"ContainerStarted","Data":"78e72461d4f520b90a0f06cff10ee6448930c3b358f6c354e3cbe6369e7cea29"} Jan 30 21:27:48 crc kubenswrapper[4834]: I0130 21:27:48.372875 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:48 crc kubenswrapper[4834]: I0130 21:27:48.372890 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:48 crc kubenswrapper[4834]: I0130 21:27:48.425910 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:48 crc kubenswrapper[4834]: I0130 21:27:48.432107 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" podStartSLOduration=7.432087358 podStartE2EDuration="7.432087358s" podCreationTimestamp="2026-01-30 21:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-01-30 21:27:48.424933684 +0000 UTC m=+719.578079832" watchObservedRunningTime="2026-01-30 21:27:48.432087358 +0000 UTC m=+719.585233496" Jan 30 21:27:49 crc kubenswrapper[4834]: I0130 21:27:49.380014 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:49 crc kubenswrapper[4834]: I0130 21:27:49.412669 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:27:49 crc kubenswrapper[4834]: I0130 21:27:49.832095 4834 scope.go:117] "RemoveContainer" containerID="865626ac050ce7df0acccd1f7912e52a7e2a48dc3f5cf7b1b3239d78c8e00636" Jan 30 21:27:51 crc kubenswrapper[4834]: I0130 21:27:51.531040 4834 scope.go:117] "RemoveContainer" containerID="1d1bb595b13953c5708f441831dee60e73b95ed8a4ad7deae34f1eb003a5eb62" Jan 30 21:27:51 crc kubenswrapper[4834]: E0130 21:27:51.531512 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5655f_openshift-multus(25f6f1cd-cd4b-475a-85a3-4e81cda5d203)\"" pod="openshift-multus/multus-5655f" podUID="25f6f1cd-cd4b-475a-85a3-4e81cda5d203" Jan 30 21:27:57 crc kubenswrapper[4834]: I0130 21:27:57.911460 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95"] Jan 30 21:27:57 crc kubenswrapper[4834]: I0130 21:27:57.913950 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:57 crc kubenswrapper[4834]: I0130 21:27:57.918011 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 21:27:57 crc kubenswrapper[4834]: I0130 21:27:57.932129 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95"] Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.042414 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.042479 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.042523 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx4jj\" (UniqueName: \"kubernetes.io/projected/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-kube-api-access-wx4jj\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: 
I0130 21:27:58.057951 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks"] Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.058923 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.066883 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks"] Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.143467 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.143589 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.143708 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx4jj\" (UniqueName: \"kubernetes.io/projected/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-kube-api-access-wx4jj\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" 
Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.144645 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.144656 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.170083 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx4jj\" (UniqueName: \"kubernetes.io/projected/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-kube-api-access-wx4jj\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.237822 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.245067 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-894kk\" (UniqueName: \"kubernetes.io/projected/4b974467-c941-4dd3-86f1-e9757bce2972-kube-api-access-894kk\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.245135 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.245189 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.261754 4834 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79_0(c21806db27a64763698830ce3c4e0205da3bd1885cbab7618c5fc2c1d5eaf845): no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.261981 4834 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79_0(c21806db27a64763698830ce3c4e0205da3bd1885cbab7618c5fc2c1d5eaf845): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.262124 4834 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79_0(c21806db27a64763698830ce3c4e0205da3bd1885cbab7618c5fc2c1d5eaf845): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.262310 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace(9f92c84d-9cef-44b9-a0c5-61e83cfbdf79)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace(9f92c84d-9cef-44b9-a0c5-61e83cfbdf79)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79_0(c21806db27a64763698830ce3c4e0205da3bd1885cbab7618c5fc2c1d5eaf845): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.346535 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.346836 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-894kk\" (UniqueName: \"kubernetes.io/projected/4b974467-c941-4dd3-86f1-e9757bce2972-kube-api-access-894kk\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " 
pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.346930 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.347280 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.347664 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.376070 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-894kk\" (UniqueName: \"kubernetes.io/projected/4b974467-c941-4dd3-86f1-e9757bce2972-kube-api-access-894kk\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.434907 4834 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.435843 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.477310 4834 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79_0(ff9af2f24b2f56f286a7bc3c9b40116d95aaef91a303ee323250bda9549bdf35): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.477370 4834 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79_0(ff9af2f24b2f56f286a7bc3c9b40116d95aaef91a303ee323250bda9549bdf35): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.477422 4834 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79_0(ff9af2f24b2f56f286a7bc3c9b40116d95aaef91a303ee323250bda9549bdf35): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.477490 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace(9f92c84d-9cef-44b9-a0c5-61e83cfbdf79)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace(9f92c84d-9cef-44b9-a0c5-61e83cfbdf79)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_openshift-marketplace_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79_0(ff9af2f24b2f56f286a7bc3c9b40116d95aaef91a303ee323250bda9549bdf35): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" Jan 30 21:27:58 crc kubenswrapper[4834]: I0130 21:27:58.671960 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.705282 4834 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace_4b974467-c941-4dd3-86f1-e9757bce2972_0(a2934e5938185c8f1b7de84e272398f19ebe167ca40660837660b801791878c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.705368 4834 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace_4b974467-c941-4dd3-86f1-e9757bce2972_0(a2934e5938185c8f1b7de84e272398f19ebe167ca40660837660b801791878c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.705418 4834 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace_4b974467-c941-4dd3-86f1-e9757bce2972_0(a2934e5938185c8f1b7de84e272398f19ebe167ca40660837660b801791878c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:58 crc kubenswrapper[4834]: E0130 21:27:58.705479 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace(4b974467-c941-4dd3-86f1-e9757bce2972)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace(4b974467-c941-4dd3-86f1-e9757bce2972)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace_4b974467-c941-4dd3-86f1-e9757bce2972_0(a2934e5938185c8f1b7de84e272398f19ebe167ca40660837660b801791878c0): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" Jan 30 21:27:59 crc kubenswrapper[4834]: I0130 21:27:59.441343 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:59 crc kubenswrapper[4834]: I0130 21:27:59.442382 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:59 crc kubenswrapper[4834]: E0130 21:27:59.477542 4834 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace_4b974467-c941-4dd3-86f1-e9757bce2972_0(bc03138cc50570aaf1eda33697372a6a54314cb9cbd036e4dcd6ba5a9fe0aeb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 30 21:27:59 crc kubenswrapper[4834]: E0130 21:27:59.477624 4834 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace_4b974467-c941-4dd3-86f1-e9757bce2972_0(bc03138cc50570aaf1eda33697372a6a54314cb9cbd036e4dcd6ba5a9fe0aeb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:59 crc kubenswrapper[4834]: E0130 21:27:59.477660 4834 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace_4b974467-c941-4dd3-86f1-e9757bce2972_0(bc03138cc50570aaf1eda33697372a6a54314cb9cbd036e4dcd6ba5a9fe0aeb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:27:59 crc kubenswrapper[4834]: E0130 21:27:59.477733 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace(4b974467-c941-4dd3-86f1-e9757bce2972)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace(4b974467-c941-4dd3-86f1-e9757bce2972)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_openshift-marketplace_4b974467-c941-4dd3-86f1-e9757bce2972_0(bc03138cc50570aaf1eda33697372a6a54314cb9cbd036e4dcd6ba5a9fe0aeb5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" Jan 30 21:28:04 crc kubenswrapper[4834]: I0130 21:28:04.161462 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:28:04 crc kubenswrapper[4834]: I0130 21:28:04.161819 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:28:06 crc kubenswrapper[4834]: I0130 21:28:06.531741 4834 scope.go:117] "RemoveContainer" containerID="1d1bb595b13953c5708f441831dee60e73b95ed8a4ad7deae34f1eb003a5eb62" Jan 30 21:28:07 crc kubenswrapper[4834]: I0130 21:28:07.499896 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5655f_25f6f1cd-cd4b-475a-85a3-4e81cda5d203/kube-multus/2.log" Jan 30 21:28:07 crc kubenswrapper[4834]: I0130 21:28:07.500172 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5655f" event={"ID":"25f6f1cd-cd4b-475a-85a3-4e81cda5d203","Type":"ContainerStarted","Data":"cafac85868db64214660c24bd1c37489c3efd703348d610b5609ce776c88b823"} Jan 30 21:28:11 crc kubenswrapper[4834]: I0130 21:28:11.530815 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:28:11 crc kubenswrapper[4834]: I0130 21:28:11.532126 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:28:11 crc kubenswrapper[4834]: I0130 21:28:11.795040 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4ttfn" Jan 30 21:28:11 crc kubenswrapper[4834]: I0130 21:28:11.872697 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95"] Jan 30 21:28:12 crc kubenswrapper[4834]: I0130 21:28:12.531610 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" event={"ID":"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79","Type":"ContainerStarted","Data":"46701ea985213561fe430f2adc5a753abdbf14de4b14c0c08d969016877c7388"} Jan 30 21:28:12 crc kubenswrapper[4834]: I0130 21:28:12.532203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" event={"ID":"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79","Type":"ContainerStarted","Data":"09e093dbf8c07bdf5d5dd4946230de1cdf087278523c855d5d6f00b07ccddbc8"} Jan 30 21:28:13 crc kubenswrapper[4834]: I0130 21:28:13.530822 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:28:13 crc kubenswrapper[4834]: I0130 21:28:13.531954 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:28:13 crc kubenswrapper[4834]: I0130 21:28:13.785791 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks"] Jan 30 21:28:14 crc kubenswrapper[4834]: I0130 21:28:14.547327 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" event={"ID":"4b974467-c941-4dd3-86f1-e9757bce2972","Type":"ContainerStarted","Data":"73e5185164a620cc99cbcb0429e1af1bc284c84113b7cc9134110ea8c4498a34"} Jan 30 21:28:15 crc kubenswrapper[4834]: I0130 21:28:15.554052 4834 generic.go:334] "Generic (PLEG): container finished" podID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerID="46701ea985213561fe430f2adc5a753abdbf14de4b14c0c08d969016877c7388" exitCode=0 Jan 30 21:28:15 crc kubenswrapper[4834]: I0130 21:28:15.554119 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" event={"ID":"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79","Type":"ContainerDied","Data":"46701ea985213561fe430f2adc5a753abdbf14de4b14c0c08d969016877c7388"} Jan 30 21:28:15 crc kubenswrapper[4834]: I0130 21:28:15.556851 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" event={"ID":"4b974467-c941-4dd3-86f1-e9757bce2972","Type":"ContainerDied","Data":"e5173c3dd314c692f7b34ef8cf0160ae8ae7bfc48756bbf3e3804d94bc7bff1d"} Jan 30 21:28:15 crc kubenswrapper[4834]: I0130 21:28:15.556712 4834 generic.go:334] "Generic (PLEG): container finished" podID="4b974467-c941-4dd3-86f1-e9757bce2972" containerID="e5173c3dd314c692f7b34ef8cf0160ae8ae7bfc48756bbf3e3804d94bc7bff1d" exitCode=0 Jan 30 21:28:17 crc kubenswrapper[4834]: I0130 21:28:17.572720 4834 
generic.go:334] "Generic (PLEG): container finished" podID="4b974467-c941-4dd3-86f1-e9757bce2972" containerID="3dc78b9cfa1dc77eb3d3f10d271f04b98e1c5d6e73cb14f07e58ce077f9fd2e8" exitCode=0 Jan 30 21:28:17 crc kubenswrapper[4834]: I0130 21:28:17.572773 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" event={"ID":"4b974467-c941-4dd3-86f1-e9757bce2972","Type":"ContainerDied","Data":"3dc78b9cfa1dc77eb3d3f10d271f04b98e1c5d6e73cb14f07e58ce077f9fd2e8"} Jan 30 21:28:17 crc kubenswrapper[4834]: I0130 21:28:17.577070 4834 generic.go:334] "Generic (PLEG): container finished" podID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerID="311b13d77d86b60f32c6aee668a737b44ff2a0034a0cef1d99d10ea01d552756" exitCode=0 Jan 30 21:28:17 crc kubenswrapper[4834]: I0130 21:28:17.577120 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" event={"ID":"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79","Type":"ContainerDied","Data":"311b13d77d86b60f32c6aee668a737b44ff2a0034a0cef1d99d10ea01d552756"} Jan 30 21:28:18 crc kubenswrapper[4834]: I0130 21:28:18.593288 4834 generic.go:334] "Generic (PLEG): container finished" podID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerID="a5a7c5a343c06178707efa4dd643bc4d08ffe103b85720e9ac180547e4bd6297" exitCode=0 Jan 30 21:28:18 crc kubenswrapper[4834]: I0130 21:28:18.593388 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" event={"ID":"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79","Type":"ContainerDied","Data":"a5a7c5a343c06178707efa4dd643bc4d08ffe103b85720e9ac180547e4bd6297"} Jan 30 21:28:18 crc kubenswrapper[4834]: I0130 21:28:18.599533 4834 generic.go:334] "Generic (PLEG): container finished" podID="4b974467-c941-4dd3-86f1-e9757bce2972" 
containerID="09b0ea7a68f89e0421632f961625dbdbabdf26cbaec6fef5b1c7cbb0df81db07" exitCode=0 Jan 30 21:28:18 crc kubenswrapper[4834]: I0130 21:28:18.599586 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" event={"ID":"4b974467-c941-4dd3-86f1-e9757bce2972","Type":"ContainerDied","Data":"09b0ea7a68f89e0421632f961625dbdbabdf26cbaec6fef5b1c7cbb0df81db07"} Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.927341 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.934554 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.978745 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx4jj\" (UniqueName: \"kubernetes.io/projected/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-kube-api-access-wx4jj\") pod \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.978832 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-util\") pod \"4b974467-c941-4dd3-86f1-e9757bce2972\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.978878 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-bundle\") pod \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " Jan 30 21:28:19 crc 
kubenswrapper[4834]: I0130 21:28:19.978973 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-util\") pod \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\" (UID: \"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79\") " Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.979079 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-bundle\") pod \"4b974467-c941-4dd3-86f1-e9757bce2972\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.979129 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-894kk\" (UniqueName: \"kubernetes.io/projected/4b974467-c941-4dd3-86f1-e9757bce2972-kube-api-access-894kk\") pod \"4b974467-c941-4dd3-86f1-e9757bce2972\" (UID: \"4b974467-c941-4dd3-86f1-e9757bce2972\") " Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.985760 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-bundle" (OuterVolumeSpecName: "bundle") pod "9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" (UID: "9f92c84d-9cef-44b9-a0c5-61e83cfbdf79"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.986190 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-bundle" (OuterVolumeSpecName: "bundle") pod "4b974467-c941-4dd3-86f1-e9757bce2972" (UID: "4b974467-c941-4dd3-86f1-e9757bce2972"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.989116 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-kube-api-access-wx4jj" (OuterVolumeSpecName: "kube-api-access-wx4jj") pod "9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" (UID: "9f92c84d-9cef-44b9-a0c5-61e83cfbdf79"). InnerVolumeSpecName "kube-api-access-wx4jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:28:19 crc kubenswrapper[4834]: I0130 21:28:19.989816 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b974467-c941-4dd3-86f1-e9757bce2972-kube-api-access-894kk" (OuterVolumeSpecName: "kube-api-access-894kk") pod "4b974467-c941-4dd3-86f1-e9757bce2972" (UID: "4b974467-c941-4dd3-86f1-e9757bce2972"). InnerVolumeSpecName "kube-api-access-894kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.001561 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-util" (OuterVolumeSpecName: "util") pod "9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" (UID: "9f92c84d-9cef-44b9-a0c5-61e83cfbdf79"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.018524 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-util" (OuterVolumeSpecName: "util") pod "4b974467-c941-4dd3-86f1-e9757bce2972" (UID: "4b974467-c941-4dd3-86f1-e9757bce2972"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.080326 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.080839 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-894kk\" (UniqueName: \"kubernetes.io/projected/4b974467-c941-4dd3-86f1-e9757bce2972-kube-api-access-894kk\") on node \"crc\" DevicePath \"\"" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.081024 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx4jj\" (UniqueName: \"kubernetes.io/projected/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-kube-api-access-wx4jj\") on node \"crc\" DevicePath \"\"" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.081150 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b974467-c941-4dd3-86f1-e9757bce2972-util\") on node \"crc\" DevicePath \"\"" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.081276 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.081387 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f92c84d-9cef-44b9-a0c5-61e83cfbdf79-util\") on node \"crc\" DevicePath \"\"" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.615496 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.615495 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95" event={"ID":"9f92c84d-9cef-44b9-a0c5-61e83cfbdf79","Type":"ContainerDied","Data":"09e093dbf8c07bdf5d5dd4946230de1cdf087278523c855d5d6f00b07ccddbc8"} Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.615659 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e093dbf8c07bdf5d5dd4946230de1cdf087278523c855d5d6f00b07ccddbc8" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.618759 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" event={"ID":"4b974467-c941-4dd3-86f1-e9757bce2972","Type":"ContainerDied","Data":"73e5185164a620cc99cbcb0429e1af1bc284c84113b7cc9134110ea8c4498a34"} Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.618792 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73e5185164a620cc99cbcb0429e1af1bc284c84113b7cc9134110ea8c4498a34" Jan 30 21:28:20 crc kubenswrapper[4834]: I0130 21:28:20.618847 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks" Jan 30 21:28:25 crc kubenswrapper[4834]: I0130 21:28:25.556834 4834 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.352758 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj"] Jan 30 21:28:27 crc kubenswrapper[4834]: E0130 21:28:27.352968 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" containerName="util" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.352984 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" containerName="util" Jan 30 21:28:27 crc kubenswrapper[4834]: E0130 21:28:27.352999 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" containerName="extract" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.353007 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" containerName="extract" Jan 30 21:28:27 crc kubenswrapper[4834]: E0130 21:28:27.353024 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerName="extract" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.353032 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerName="extract" Jan 30 21:28:27 crc kubenswrapper[4834]: E0130 21:28:27.353042 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" containerName="pull" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.353048 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" 
containerName="pull" Jan 30 21:28:27 crc kubenswrapper[4834]: E0130 21:28:27.353057 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerName="pull" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.353062 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerName="pull" Jan 30 21:28:27 crc kubenswrapper[4834]: E0130 21:28:27.353071 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerName="util" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.353077 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerName="util" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.353178 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b974467-c941-4dd3-86f1-e9757bce2972" containerName="extract" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.353187 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f92c84d-9cef-44b9-a0c5-61e83cfbdf79" containerName="extract" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.353554 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.355112 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.355130 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-xnthb" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.355254 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.362599 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj"] Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.474047 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbqlg\" (UniqueName: \"kubernetes.io/projected/8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae-kube-api-access-fbqlg\") pod \"cluster-logging-operator-79cf69ddc8-4ttdj\" (UID: \"8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.575487 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbqlg\" (UniqueName: \"kubernetes.io/projected/8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae-kube-api-access-fbqlg\") pod \"cluster-logging-operator-79cf69ddc8-4ttdj\" (UID: \"8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.595899 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbqlg\" (UniqueName: \"kubernetes.io/projected/8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae-kube-api-access-fbqlg\") pod 
\"cluster-logging-operator-79cf69ddc8-4ttdj\" (UID: \"8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.667430 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj" Jan 30 21:28:27 crc kubenswrapper[4834]: I0130 21:28:27.949061 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj"] Jan 30 21:28:28 crc kubenswrapper[4834]: I0130 21:28:28.668516 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj" event={"ID":"8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae","Type":"ContainerStarted","Data":"6382e4d3a5d8e5520cf92d3097e4120f7a3748ca7090f1032c9fb8da26f934d3"} Jan 30 21:28:33 crc kubenswrapper[4834]: I0130 21:28:33.703712 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj" event={"ID":"8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae","Type":"ContainerStarted","Data":"619cdf04c5dbb64f35f82d62df2611ace75673e0cdbf1f6c488914590fd19ee7"} Jan 30 21:28:33 crc kubenswrapper[4834]: I0130 21:28:33.722575 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4ttdj" podStartSLOduration=1.328866841 podStartE2EDuration="6.722558291s" podCreationTimestamp="2026-01-30 21:28:27 +0000 UTC" firstStartedPulling="2026-01-30 21:28:27.968724737 +0000 UTC m=+759.121870875" lastFinishedPulling="2026-01-30 21:28:33.362416187 +0000 UTC m=+764.515562325" observedRunningTime="2026-01-30 21:28:33.721232843 +0000 UTC m=+764.874378981" watchObservedRunningTime="2026-01-30 21:28:33.722558291 +0000 UTC m=+764.875704429" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.160595 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.160649 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.724462 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx"] Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.725270 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.727534 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.727682 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.727774 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.728644 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.728755 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-whb5n" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.729302 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.733859 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx"] Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.876054 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dzrg\" (UniqueName: \"kubernetes.io/projected/539234c0-ea70-4188-b1d2-e5b758c78563-kube-api-access-4dzrg\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.876108 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.876238 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/539234c0-ea70-4188-b1d2-e5b758c78563-manager-config\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.876289 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-webhook-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.876466 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-apiservice-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.977772 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dzrg\" (UniqueName: \"kubernetes.io/projected/539234c0-ea70-4188-b1d2-e5b758c78563-kube-api-access-4dzrg\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.977817 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.977844 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/539234c0-ea70-4188-b1d2-e5b758c78563-manager-config\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.977863 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-webhook-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.977904 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-apiservice-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.979296 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/539234c0-ea70-4188-b1d2-e5b758c78563-manager-config\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.983157 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-apiservice-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.983934 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:34 crc kubenswrapper[4834]: I0130 21:28:34.988593 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/539234c0-ea70-4188-b1d2-e5b758c78563-webhook-cert\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:35 crc kubenswrapper[4834]: I0130 21:28:35.000168 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dzrg\" (UniqueName: \"kubernetes.io/projected/539234c0-ea70-4188-b1d2-e5b758c78563-kube-api-access-4dzrg\") pod \"loki-operator-controller-manager-56cf686fd5-j4wjx\" (UID: \"539234c0-ea70-4188-b1d2-e5b758c78563\") " pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:35 crc kubenswrapper[4834]: I0130 21:28:35.041218 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:35 crc kubenswrapper[4834]: I0130 21:28:35.279148 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx"] Jan 30 21:28:35 crc kubenswrapper[4834]: W0130 21:28:35.289600 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod539234c0_ea70_4188_b1d2_e5b758c78563.slice/crio-b5e1fa7bc8b3c08c1208506a6612c8b942c0333fd3c6f373606f332f085b1110 WatchSource:0}: Error finding container b5e1fa7bc8b3c08c1208506a6612c8b942c0333fd3c6f373606f332f085b1110: Status 404 returned error can't find the container with id b5e1fa7bc8b3c08c1208506a6612c8b942c0333fd3c6f373606f332f085b1110 Jan 30 21:28:35 crc kubenswrapper[4834]: I0130 21:28:35.724696 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" event={"ID":"539234c0-ea70-4188-b1d2-e5b758c78563","Type":"ContainerStarted","Data":"b5e1fa7bc8b3c08c1208506a6612c8b942c0333fd3c6f373606f332f085b1110"} Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.704681 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7ggts"] Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.706049 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.726138 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ggts"] Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.755427 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" event={"ID":"539234c0-ea70-4188-b1d2-e5b758c78563","Type":"ContainerStarted","Data":"584b0c1be06357e74f9276ede2272f0c5ddd2b78a58a5a465b7ad6d772cbb33c"} Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.835864 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rjv\" (UniqueName: \"kubernetes.io/projected/a2ecf74e-0aec-464c-b7e7-11670319f04c-kube-api-access-58rjv\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.836162 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ecf74e-0aec-464c-b7e7-11670319f04c-catalog-content\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.836204 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ecf74e-0aec-464c-b7e7-11670319f04c-utilities\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.937166 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-58rjv\" (UniqueName: \"kubernetes.io/projected/a2ecf74e-0aec-464c-b7e7-11670319f04c-kube-api-access-58rjv\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.937213 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ecf74e-0aec-464c-b7e7-11670319f04c-catalog-content\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.937275 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ecf74e-0aec-464c-b7e7-11670319f04c-utilities\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.937740 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ecf74e-0aec-464c-b7e7-11670319f04c-utilities\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.937835 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ecf74e-0aec-464c-b7e7-11670319f04c-catalog-content\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:39 crc kubenswrapper[4834]: I0130 21:28:39.960978 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rjv\" (UniqueName: 
\"kubernetes.io/projected/a2ecf74e-0aec-464c-b7e7-11670319f04c-kube-api-access-58rjv\") pod \"certified-operators-7ggts\" (UID: \"a2ecf74e-0aec-464c-b7e7-11670319f04c\") " pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:40 crc kubenswrapper[4834]: I0130 21:28:40.034315 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:40 crc kubenswrapper[4834]: I0130 21:28:40.209770 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ggts"] Jan 30 21:28:40 crc kubenswrapper[4834]: I0130 21:28:40.761881 4834 generic.go:334] "Generic (PLEG): container finished" podID="a2ecf74e-0aec-464c-b7e7-11670319f04c" containerID="905b714631a64981077aea65a97c2699294af7281a9aaf12e37b1c8c9ea698c5" exitCode=0 Jan 30 21:28:40 crc kubenswrapper[4834]: I0130 21:28:40.761934 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggts" event={"ID":"a2ecf74e-0aec-464c-b7e7-11670319f04c","Type":"ContainerDied","Data":"905b714631a64981077aea65a97c2699294af7281a9aaf12e37b1c8c9ea698c5"} Jan 30 21:28:40 crc kubenswrapper[4834]: I0130 21:28:40.761966 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggts" event={"ID":"a2ecf74e-0aec-464c-b7e7-11670319f04c","Type":"ContainerStarted","Data":"ce9ab286bee38bfb4f8c6369b58ad1b95114ba54c773749c2cb2e6b77886be36"} Jan 30 21:28:47 crc kubenswrapper[4834]: I0130 21:28:47.806087 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" event={"ID":"539234c0-ea70-4188-b1d2-e5b758c78563","Type":"ContainerStarted","Data":"ef8ea2a378a1420405a89429b6114766d2f9c07a984062334f6e93075d565f04"} Jan 30 21:28:47 crc kubenswrapper[4834]: I0130 21:28:47.807899 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:47 crc kubenswrapper[4834]: I0130 21:28:47.808913 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" Jan 30 21:28:47 crc kubenswrapper[4834]: I0130 21:28:47.809729 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggts" event={"ID":"a2ecf74e-0aec-464c-b7e7-11670319f04c","Type":"ContainerStarted","Data":"e98727df21b0f6421d19f7e9928dadbfccc8419952c7bed9d0b67514c4e3e303"} Jan 30 21:28:47 crc kubenswrapper[4834]: I0130 21:28:47.853280 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-56cf686fd5-j4wjx" podStartSLOduration=1.653152038 podStartE2EDuration="13.853260495s" podCreationTimestamp="2026-01-30 21:28:34 +0000 UTC" firstStartedPulling="2026-01-30 21:28:35.292601535 +0000 UTC m=+766.445747673" lastFinishedPulling="2026-01-30 21:28:47.492709992 +0000 UTC m=+778.645856130" observedRunningTime="2026-01-30 21:28:47.830883577 +0000 UTC m=+778.984029725" watchObservedRunningTime="2026-01-30 21:28:47.853260495 +0000 UTC m=+779.006406633" Jan 30 21:28:48 crc kubenswrapper[4834]: I0130 21:28:48.819429 4834 generic.go:334] "Generic (PLEG): container finished" podID="a2ecf74e-0aec-464c-b7e7-11670319f04c" containerID="e98727df21b0f6421d19f7e9928dadbfccc8419952c7bed9d0b67514c4e3e303" exitCode=0 Jan 30 21:28:48 crc kubenswrapper[4834]: I0130 21:28:48.821627 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggts" event={"ID":"a2ecf74e-0aec-464c-b7e7-11670319f04c","Type":"ContainerDied","Data":"e98727df21b0f6421d19f7e9928dadbfccc8419952c7bed9d0b67514c4e3e303"} Jan 30 21:28:48 crc kubenswrapper[4834]: I0130 21:28:48.897674 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-49knk"] Jan 30 21:28:48 crc kubenswrapper[4834]: I0130 21:28:48.899442 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:48 crc kubenswrapper[4834]: I0130 21:28:48.913991 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-49knk"] Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.070877 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d29ebe5-230e-468b-8344-bdfa02c88095-utilities\") pod \"community-operators-49knk\" (UID: \"7d29ebe5-230e-468b-8344-bdfa02c88095\") " pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.071070 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d29ebe5-230e-468b-8344-bdfa02c88095-catalog-content\") pod \"community-operators-49knk\" (UID: \"7d29ebe5-230e-468b-8344-bdfa02c88095\") " pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.071185 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbh4p\" (UniqueName: \"kubernetes.io/projected/7d29ebe5-230e-468b-8344-bdfa02c88095-kube-api-access-gbh4p\") pod \"community-operators-49knk\" (UID: \"7d29ebe5-230e-468b-8344-bdfa02c88095\") " pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.173133 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbh4p\" (UniqueName: \"kubernetes.io/projected/7d29ebe5-230e-468b-8344-bdfa02c88095-kube-api-access-gbh4p\") pod \"community-operators-49knk\" (UID: 
\"7d29ebe5-230e-468b-8344-bdfa02c88095\") " pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.173299 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d29ebe5-230e-468b-8344-bdfa02c88095-utilities\") pod \"community-operators-49knk\" (UID: \"7d29ebe5-230e-468b-8344-bdfa02c88095\") " pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.173489 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d29ebe5-230e-468b-8344-bdfa02c88095-catalog-content\") pod \"community-operators-49knk\" (UID: \"7d29ebe5-230e-468b-8344-bdfa02c88095\") " pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.174090 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d29ebe5-230e-468b-8344-bdfa02c88095-utilities\") pod \"community-operators-49knk\" (UID: \"7d29ebe5-230e-468b-8344-bdfa02c88095\") " pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.174222 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d29ebe5-230e-468b-8344-bdfa02c88095-catalog-content\") pod \"community-operators-49knk\" (UID: \"7d29ebe5-230e-468b-8344-bdfa02c88095\") " pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.197269 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbh4p\" (UniqueName: \"kubernetes.io/projected/7d29ebe5-230e-468b-8344-bdfa02c88095-kube-api-access-gbh4p\") pod \"community-operators-49knk\" (UID: \"7d29ebe5-230e-468b-8344-bdfa02c88095\") " 
pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.216834 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-49knk" Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.737375 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-49knk"] Jan 30 21:28:49 crc kubenswrapper[4834]: W0130 21:28:49.741087 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d29ebe5_230e_468b_8344_bdfa02c88095.slice/crio-33ca51e6de672eb9a49d8a83350977569d346b90c32508973166e5d015d59f03 WatchSource:0}: Error finding container 33ca51e6de672eb9a49d8a83350977569d346b90c32508973166e5d015d59f03: Status 404 returned error can't find the container with id 33ca51e6de672eb9a49d8a83350977569d346b90c32508973166e5d015d59f03 Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.824561 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49knk" event={"ID":"7d29ebe5-230e-468b-8344-bdfa02c88095","Type":"ContainerStarted","Data":"33ca51e6de672eb9a49d8a83350977569d346b90c32508973166e5d015d59f03"} Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.829738 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggts" event={"ID":"a2ecf74e-0aec-464c-b7e7-11670319f04c","Type":"ContainerStarted","Data":"48ba3984f928c3684111cfaecb44d57687c6702e5b9acd7a270209c8ea719a30"} Jan 30 21:28:49 crc kubenswrapper[4834]: I0130 21:28:49.851720 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7ggts" podStartSLOduration=2.36096537 podStartE2EDuration="10.851702123s" podCreationTimestamp="2026-01-30 21:28:39 +0000 UTC" firstStartedPulling="2026-01-30 21:28:40.763630489 +0000 UTC m=+771.916776627" 
lastFinishedPulling="2026-01-30 21:28:49.254367242 +0000 UTC m=+780.407513380" observedRunningTime="2026-01-30 21:28:49.848925934 +0000 UTC m=+781.002072062" watchObservedRunningTime="2026-01-30 21:28:49.851702123 +0000 UTC m=+781.004848261" Jan 30 21:28:50 crc kubenswrapper[4834]: I0130 21:28:50.035037 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:50 crc kubenswrapper[4834]: I0130 21:28:50.035295 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:28:50 crc kubenswrapper[4834]: I0130 21:28:50.835981 4834 generic.go:334] "Generic (PLEG): container finished" podID="7d29ebe5-230e-468b-8344-bdfa02c88095" containerID="4756653d51db3cbb69c4ce2c3d280ddd1df5d7b69ee1998e6b112fe1b32636f8" exitCode=0 Jan 30 21:28:50 crc kubenswrapper[4834]: I0130 21:28:50.836043 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49knk" event={"ID":"7d29ebe5-230e-468b-8344-bdfa02c88095","Type":"ContainerDied","Data":"4756653d51db3cbb69c4ce2c3d280ddd1df5d7b69ee1998e6b112fe1b32636f8"} Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.083179 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ggts" podUID="a2ecf74e-0aec-464c-b7e7-11670319f04c" containerName="registry-server" probeResult="failure" output=< Jan 30 21:28:51 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:28:51 crc kubenswrapper[4834]: > Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.351104 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.351926 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.356771 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.357040 4834 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-dcmz5" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.359043 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.359797 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.499533 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swr6k\" (UniqueName: \"kubernetes.io/projected/78a4493b-f3e7-4d04-914a-e952c7c5eb09-kube-api-access-swr6k\") pod \"minio\" (UID: \"78a4493b-f3e7-4d04-914a-e952c7c5eb09\") " pod="minio-dev/minio" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.499595 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-61d99487-c975-4866-92b8-dc0948a7f836\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61d99487-c975-4866-92b8-dc0948a7f836\") pod \"minio\" (UID: \"78a4493b-f3e7-4d04-914a-e952c7c5eb09\") " pod="minio-dev/minio" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.601303 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-61d99487-c975-4866-92b8-dc0948a7f836\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61d99487-c975-4866-92b8-dc0948a7f836\") pod \"minio\" (UID: \"78a4493b-f3e7-4d04-914a-e952c7c5eb09\") " pod="minio-dev/minio" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.601743 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-swr6k\" (UniqueName: \"kubernetes.io/projected/78a4493b-f3e7-4d04-914a-e952c7c5eb09-kube-api-access-swr6k\") pod \"minio\" (UID: \"78a4493b-f3e7-4d04-914a-e952c7c5eb09\") " pod="minio-dev/minio" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.608445 4834 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.608481 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-61d99487-c975-4866-92b8-dc0948a7f836\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61d99487-c975-4866-92b8-dc0948a7f836\") pod \"minio\" (UID: \"78a4493b-f3e7-4d04-914a-e952c7c5eb09\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7c467ee103fd8bee60609ad27e6dca84a9602a71a1ec91e0a4adb2232bfcc328/globalmount\"" pod="minio-dev/minio" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.628303 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swr6k\" (UniqueName: \"kubernetes.io/projected/78a4493b-f3e7-4d04-914a-e952c7c5eb09-kube-api-access-swr6k\") pod \"minio\" (UID: \"78a4493b-f3e7-4d04-914a-e952c7c5eb09\") " pod="minio-dev/minio" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.661225 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-61d99487-c975-4866-92b8-dc0948a7f836\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61d99487-c975-4866-92b8-dc0948a7f836\") pod \"minio\" (UID: \"78a4493b-f3e7-4d04-914a-e952c7c5eb09\") " pod="minio-dev/minio" Jan 30 21:28:51 crc kubenswrapper[4834]: I0130 21:28:51.668175 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Jan 30 21:28:52 crc kubenswrapper[4834]: I0130 21:28:52.105877 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 30 21:28:52 crc kubenswrapper[4834]: I0130 21:28:52.847981 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"78a4493b-f3e7-4d04-914a-e952c7c5eb09","Type":"ContainerStarted","Data":"cd923baf5b33cf020c5d255632597b05f6b1406ddce4c39b4206bc6977673bd9"} Jan 30 21:28:59 crc kubenswrapper[4834]: I0130 21:28:59.890411 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49knk" event={"ID":"7d29ebe5-230e-468b-8344-bdfa02c88095","Type":"ContainerStarted","Data":"0f5683daf36cded47ccd7fa39b45e996affaf99fe7957f8543dfe3f320d74b94"} Jan 30 21:28:59 crc kubenswrapper[4834]: I0130 21:28:59.891621 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"78a4493b-f3e7-4d04-914a-e952c7c5eb09","Type":"ContainerStarted","Data":"08caf699e6dadde59347aa023942da95877a346fe10dfa182cd30d3a4bf03211"} Jan 30 21:29:00 crc kubenswrapper[4834]: I0130 21:29:00.080514 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:29:00 crc kubenswrapper[4834]: I0130 21:29:00.120816 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7ggts" Jan 30 21:29:00 crc kubenswrapper[4834]: I0130 21:29:00.902093 4834 generic.go:334] "Generic (PLEG): container finished" podID="7d29ebe5-230e-468b-8344-bdfa02c88095" containerID="0f5683daf36cded47ccd7fa39b45e996affaf99fe7957f8543dfe3f320d74b94" exitCode=0 Jan 30 21:29:00 crc kubenswrapper[4834]: I0130 21:29:00.902503 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49knk" 
event={"ID":"7d29ebe5-230e-468b-8344-bdfa02c88095","Type":"ContainerDied","Data":"0f5683daf36cded47ccd7fa39b45e996affaf99fe7957f8543dfe3f320d74b94"} Jan 30 21:29:00 crc kubenswrapper[4834]: I0130 21:29:00.943144 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.591067846 podStartE2EDuration="11.943129797s" podCreationTimestamp="2026-01-30 21:28:49 +0000 UTC" firstStartedPulling="2026-01-30 21:28:52.116530787 +0000 UTC m=+783.269676925" lastFinishedPulling="2026-01-30 21:28:59.468592738 +0000 UTC m=+790.621738876" observedRunningTime="2026-01-30 21:29:00.940195783 +0000 UTC m=+792.093341961" watchObservedRunningTime="2026-01-30 21:29:00.943129797 +0000 UTC m=+792.096275935" Jan 30 21:29:00 crc kubenswrapper[4834]: I0130 21:29:00.967070 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ggts"] Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.324644 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmwt6"] Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.324918 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xmwt6" podUID="1421282d-913d-46ae-b270-fdde46df72a3" containerName="registry-server" containerID="cri-o://244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce" gracePeriod=2 Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.733853 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.862054 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-utilities\") pod \"1421282d-913d-46ae-b270-fdde46df72a3\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.862161 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggfxd\" (UniqueName: \"kubernetes.io/projected/1421282d-913d-46ae-b270-fdde46df72a3-kube-api-access-ggfxd\") pod \"1421282d-913d-46ae-b270-fdde46df72a3\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.862198 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-catalog-content\") pod \"1421282d-913d-46ae-b270-fdde46df72a3\" (UID: \"1421282d-913d-46ae-b270-fdde46df72a3\") " Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.863025 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-utilities" (OuterVolumeSpecName: "utilities") pod "1421282d-913d-46ae-b270-fdde46df72a3" (UID: "1421282d-913d-46ae-b270-fdde46df72a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.873614 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1421282d-913d-46ae-b270-fdde46df72a3-kube-api-access-ggfxd" (OuterVolumeSpecName: "kube-api-access-ggfxd") pod "1421282d-913d-46ae-b270-fdde46df72a3" (UID: "1421282d-913d-46ae-b270-fdde46df72a3"). InnerVolumeSpecName "kube-api-access-ggfxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.908890 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-49knk" event={"ID":"7d29ebe5-230e-468b-8344-bdfa02c88095","Type":"ContainerStarted","Data":"8053b0d5c9b9e620714f99a42ae70fa1f9bef7693e4ca52f9da8e210dacd0d4e"} Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.910997 4834 generic.go:334] "Generic (PLEG): container finished" podID="1421282d-913d-46ae-b270-fdde46df72a3" containerID="244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce" exitCode=0 Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.911072 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xmwt6" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.911072 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmwt6" event={"ID":"1421282d-913d-46ae-b270-fdde46df72a3","Type":"ContainerDied","Data":"244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce"} Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.911132 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmwt6" event={"ID":"1421282d-913d-46ae-b270-fdde46df72a3","Type":"ContainerDied","Data":"a1f6021aa7635ea2902feae3de727d0deb6c8f2e0c87ef5f2d4ce9c167550fa2"} Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.911155 4834 scope.go:117] "RemoveContainer" containerID="244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.918094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1421282d-913d-46ae-b270-fdde46df72a3" (UID: 
"1421282d-913d-46ae-b270-fdde46df72a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.927377 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-49knk" podStartSLOduration=3.450396548 podStartE2EDuration="13.927362734s" podCreationTimestamp="2026-01-30 21:28:48 +0000 UTC" firstStartedPulling="2026-01-30 21:28:50.837757142 +0000 UTC m=+781.990903280" lastFinishedPulling="2026-01-30 21:29:01.314723328 +0000 UTC m=+792.467869466" observedRunningTime="2026-01-30 21:29:01.924324328 +0000 UTC m=+793.077470466" watchObservedRunningTime="2026-01-30 21:29:01.927362734 +0000 UTC m=+793.080508872" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.928638 4834 scope.go:117] "RemoveContainer" containerID="39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.942918 4834 scope.go:117] "RemoveContainer" containerID="f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.957062 4834 scope.go:117] "RemoveContainer" containerID="244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce" Jan 30 21:29:01 crc kubenswrapper[4834]: E0130 21:29:01.957414 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce\": container with ID starting with 244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce not found: ID does not exist" containerID="244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.957450 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce"} err="failed to get container status \"244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce\": rpc error: code = NotFound desc = could not find container \"244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce\": container with ID starting with 244f593e783cbb1d72eab99a4e9bd8a629ce5e87eff60723ecfdd4c0c16939ce not found: ID does not exist" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.957474 4834 scope.go:117] "RemoveContainer" containerID="39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c" Jan 30 21:29:01 crc kubenswrapper[4834]: E0130 21:29:01.957782 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c\": container with ID starting with 39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c not found: ID does not exist" containerID="39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.957806 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c"} err="failed to get container status \"39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c\": rpc error: code = NotFound desc = could not find container \"39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c\": container with ID starting with 39d163b4f621042c90199a9882aa3e0724214d9b7fcc17f7f9807ee5b9478f5c not found: ID does not exist" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.957824 4834 scope.go:117] "RemoveContainer" containerID="f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9" Jan 30 21:29:01 crc kubenswrapper[4834]: E0130 21:29:01.958123 4834 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9\": container with ID starting with f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9 not found: ID does not exist" containerID="f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.958146 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9"} err="failed to get container status \"f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9\": rpc error: code = NotFound desc = could not find container \"f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9\": container with ID starting with f7a0f51a1f70b1d1f045cdc22c2ff3e3ad98ec0c00f9e52468f1e35de3be09f9 not found: ID does not exist" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.963983 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggfxd\" (UniqueName: \"kubernetes.io/projected/1421282d-913d-46ae-b270-fdde46df72a3-kube-api-access-ggfxd\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.964007 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:01 crc kubenswrapper[4834]: I0130 21:29:01.964019 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1421282d-913d-46ae-b270-fdde46df72a3-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:02 crc kubenswrapper[4834]: I0130 21:29:02.235666 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmwt6"] Jan 30 21:29:02 crc kubenswrapper[4834]: I0130 21:29:02.239854 4834 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xmwt6"] Jan 30 21:29:03 crc kubenswrapper[4834]: I0130 21:29:03.537039 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1421282d-913d-46ae-b270-fdde46df72a3" path="/var/lib/kubelet/pods/1421282d-913d-46ae-b270-fdde46df72a3/volumes" Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.161130 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.161500 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.161567 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.163779 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76c0ed7f9e9f321f65e1f6d65b4089c7729795a844b8db9bc32d4d6eeeb8f6b8"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.163900 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" 
containerName="machine-config-daemon" containerID="cri-o://76c0ed7f9e9f321f65e1f6d65b4089c7729795a844b8db9bc32d4d6eeeb8f6b8" gracePeriod=600 Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.931882 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="76c0ed7f9e9f321f65e1f6d65b4089c7729795a844b8db9bc32d4d6eeeb8f6b8" exitCode=0 Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.931988 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"76c0ed7f9e9f321f65e1f6d65b4089c7729795a844b8db9bc32d4d6eeeb8f6b8"} Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.932221 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"4cb5a4bc85d48be6eae743481c416969d22a0b71074f3b32022fe8457dc9c32b"} Jan 30 21:29:04 crc kubenswrapper[4834]: I0130 21:29:04.932243 4834 scope.go:117] "RemoveContainer" containerID="8ec49418611dac2916cceb2c6e5d860a3177886a0b42237393b287ec87bc697b" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.254014 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l"] Jan 30 21:29:06 crc kubenswrapper[4834]: E0130 21:29:06.255066 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1421282d-913d-46ae-b270-fdde46df72a3" containerName="extract-content" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.255086 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1421282d-913d-46ae-b270-fdde46df72a3" containerName="extract-content" Jan 30 21:29:06 crc kubenswrapper[4834]: E0130 21:29:06.255110 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1421282d-913d-46ae-b270-fdde46df72a3" containerName="extract-utilities" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.255120 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1421282d-913d-46ae-b270-fdde46df72a3" containerName="extract-utilities" Jan 30 21:29:06 crc kubenswrapper[4834]: E0130 21:29:06.255138 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1421282d-913d-46ae-b270-fdde46df72a3" containerName="registry-server" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.255146 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1421282d-913d-46ae-b270-fdde46df72a3" containerName="registry-server" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.255271 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1421282d-913d-46ae-b270-fdde46df72a3" containerName="registry-server" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.260238 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.263337 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.263498 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.263701 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.264192 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-xxm8b" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.266100 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Jan 30 21:29:06 crc 
kubenswrapper[4834]: I0130 21:29:06.276212 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.403387 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-tcqxt"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.404238 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.406434 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.406741 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.406986 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.420111 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-tcqxt"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.424104 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.424173 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: 
\"kubernetes.io/secret/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.424213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3470b03a-ff5f-4654-b8f4-db3ee90be448-config\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.424235 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.424257 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mhm9\" (UniqueName: \"kubernetes.io/projected/3470b03a-ff5f-4654-b8f4-db3ee90be448-kube-api-access-9mhm9\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.459017 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.460030 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.462180 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.469585 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.475133 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525153 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9b72\" (UniqueName: \"kubernetes.io/projected/5fc9974d-5ec1-42b1-a557-2601e6168fa1-kube-api-access-g9b72\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525196 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525220 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " 
pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525242 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525270 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525291 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-s3\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525311 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3470b03a-ff5f-4654-b8f4-db3ee90be448-config\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525329 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: 
\"kubernetes.io/secret/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525346 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mhm9\" (UniqueName: \"kubernetes.io/projected/3470b03a-ff5f-4654-b8f4-db3ee90be448-kube-api-access-9mhm9\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525368 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fc9974d-5ec1-42b1-a557-2601e6168fa1-config\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.525383 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.526141 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" 
Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.527714 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3470b03a-ff5f-4654-b8f4-db3ee90be448-config\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.531480 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.533833 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/3470b03a-ff5f-4654-b8f4-db3ee90be448-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.548789 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mhm9\" (UniqueName: \"kubernetes.io/projected/3470b03a-ff5f-4654-b8f4-db3ee90be448-kube-api-access-9mhm9\") pod \"logging-loki-distributor-5f678c8dd6-7m47l\" (UID: \"3470b03a-ff5f-4654-b8f4-db3ee90be448\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.549540 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.550379 4834 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.557380 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.557665 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.557876 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.558195 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.563820 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.567655 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.569426 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.570314 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.581096 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.584067 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626513 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626565 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tls-secret\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626604 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-s3\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626623 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626646 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tenants\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626662 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-rbac\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626682 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/032e9188-65b1-4456-9879-518958f9c1e7-config\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626721 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626739 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fc9974d-5ec1-42b1-a557-2601e6168fa1-config\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " 
pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626755 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626773 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mrl9\" (UniqueName: \"kubernetes.io/projected/032e9188-65b1-4456-9879-518958f9c1e7-kube-api-access-9mrl9\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626792 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626818 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 
21:29:06.626846 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9b72\" (UniqueName: \"kubernetes.io/projected/5fc9974d-5ec1-42b1-a557-2601e6168fa1-kube-api-access-g9b72\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626874 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626897 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626918 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94bd5\" (UniqueName: \"kubernetes.io/projected/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-kube-api-access-94bd5\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626936 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-querier-http\") pod 
\"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.626956 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-lokistack-gateway\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.629008 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fc9974d-5ec1-42b1-a557-2601e6168fa1-config\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.629684 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.631208 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-s3\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.632788 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: 
\"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.633257 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5fc9974d-5ec1-42b1-a557-2601e6168fa1-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.654243 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9b72\" (UniqueName: \"kubernetes.io/projected/5fc9974d-5ec1-42b1-a557-2601e6168fa1-kube-api-access-g9b72\") pod \"logging-loki-querier-76788598db-tcqxt\" (UID: \"5fc9974d-5ec1-42b1-a557-2601e6168fa1\") " pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.718463 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727735 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727768 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94bd5\" (UniqueName: \"kubernetes.io/projected/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-kube-api-access-94bd5\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727792 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-tenants\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727834 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-lokistack-gateway\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727856 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727872 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-tls-secret\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727887 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-rbac\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727902 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-lokistack-gateway\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727920 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tls-secret\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 
21:29:06.727963 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727982 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.727999 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tenants\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728013 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-rbac\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728027 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/032e9188-65b1-4456-9879-518958f9c1e7-config\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " 
pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728052 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728076 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728092 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mrl9\" (UniqueName: \"kubernetes.io/projected/032e9188-65b1-4456-9879-518958f9c1e7-kube-api-access-9mrl9\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728110 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728129 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728146 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728176 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s74ht\" (UniqueName: \"kubernetes.io/projected/2ed32950-7326-4344-bcdb-7843ca0162e1-kube-api-access-s74ht\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.728596 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.729895 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-gateway-ca-bundle\") pod 
\"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: E0130 21:29:06.730286 4834 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Jan 30 21:29:06 crc kubenswrapper[4834]: E0130 21:29:06.730462 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tls-secret podName:a0779eb6-e6eb-41f3-8c01-8072ad63eedd nodeName:}" failed. No retries permitted until 2026-01-30 21:29:07.230357577 +0000 UTC m=+798.383503765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tls-secret") pod "logging-loki-gateway-5d9fb787f7-m7ft4" (UID: "a0779eb6-e6eb-41f3-8c01-8072ad63eedd") : secret "logging-loki-gateway-http" not found Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.730668 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-rbac\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.732269 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.732527 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/032e9188-65b1-4456-9879-518958f9c1e7-config\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.733676 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.734125 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tenants\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.734477 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-lokistack-gateway\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.734692 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 
21:29:06.737352 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/032e9188-65b1-4456-9879-518958f9c1e7-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.750324 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94bd5\" (UniqueName: \"kubernetes.io/projected/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-kube-api-access-94bd5\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.753210 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mrl9\" (UniqueName: \"kubernetes.io/projected/032e9188-65b1-4456-9879-518958f9c1e7-kube-api-access-9mrl9\") pod \"logging-loki-query-frontend-69d9546745-tcdrc\" (UID: \"032e9188-65b1-4456-9879-518958f9c1e7\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.781712 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.834168 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-tls-secret\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.834227 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-rbac\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.834251 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-lokistack-gateway\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.834301 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.834531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.834628 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.834697 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s74ht\" (UniqueName: \"kubernetes.io/projected/2ed32950-7326-4344-bcdb-7843ca0162e1-kube-api-access-s74ht\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.834733 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-tenants\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.836155 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 
21:29:06.837106 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-lokistack-gateway\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.837269 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-rbac\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.838113 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.838620 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-tls-secret\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.842526 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " 
pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.843232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2ed32950-7326-4344-bcdb-7843ca0162e1-tenants\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.857054 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s74ht\" (UniqueName: \"kubernetes.io/projected/2ed32950-7326-4344-bcdb-7843ca0162e1-kube-api-access-s74ht\") pod \"logging-loki-gateway-5d9fb787f7-rkvxh\" (UID: \"2ed32950-7326-4344-bcdb-7843ca0162e1\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.915930 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-tcqxt"] Jan 30 21:29:06 crc kubenswrapper[4834]: I0130 21:29:06.967367 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:06 crc kubenswrapper[4834]: W0130 21:29:06.970600 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fc9974d_5ec1_42b1_a557_2601e6168fa1.slice/crio-f698fefafcc02f80661feba03661f8e7fde0d68e5f4e4a8077e4b312de3a6555 WatchSource:0}: Error finding container f698fefafcc02f80661feba03661f8e7fde0d68e5f4e4a8077e4b312de3a6555: Status 404 returned error can't find the container with id f698fefafcc02f80661feba03661f8e7fde0d68e5f4e4a8077e4b312de3a6555 Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.017114 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l"] Jan 30 21:29:07 crc kubenswrapper[4834]: W0130 21:29:07.020781 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3470b03a_ff5f_4654_b8f4_db3ee90be448.slice/crio-8c808012a6a5a400e59e4c1c787c3061f056cc3e16e8518a6cd562d1f389732e WatchSource:0}: Error finding container 8c808012a6a5a400e59e4c1c787c3061f056cc3e16e8518a6cd562d1f389732e: Status 404 returned error can't find the container with id 8c808012a6a5a400e59e4c1c787c3061f056cc3e16e8518a6cd562d1f389732e Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.156495 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh"] Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.239547 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tls-secret\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 
21:29:07.245331 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a0779eb6-e6eb-41f3-8c01-8072ad63eedd-tls-secret\") pod \"logging-loki-gateway-5d9fb787f7-m7ft4\" (UID: \"a0779eb6-e6eb-41f3-8c01-8072ad63eedd\") " pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.254479 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc"] Jan 30 21:29:07 crc kubenswrapper[4834]: W0130 21:29:07.255981 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod032e9188_65b1_4456_9879_518958f9c1e7.slice/crio-7079d4abf02bf9370798f32e14393b41e53c90f20b37ba0d5e7d64b3d58cfb9a WatchSource:0}: Error finding container 7079d4abf02bf9370798f32e14393b41e53c90f20b37ba0d5e7d64b3d58cfb9a: Status 404 returned error can't find the container with id 7079d4abf02bf9370798f32e14393b41e53c90f20b37ba0d5e7d64b3d58cfb9a Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.392613 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.393317 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.395486 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.395559 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.417581 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.455347 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.456749 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.460467 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.460710 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.462412 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.500273 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.514520 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.515798 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.522163 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.522357 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.522633 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.548560 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.548789 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hrqv\" (UniqueName: \"kubernetes.io/projected/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-kube-api-access-6hrqv\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.548914 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549026 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3302a0dd-90f7-496b-a261-930a2a18ffaa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3302a0dd-90f7-496b-a261-930a2a18ffaa\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549142 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e15edf1f-8cbf-4e1c-9c48-f531dd05584c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e15edf1f-8cbf-4e1c-9c48-f531dd05584c\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549224 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zs5t\" (UniqueName: \"kubernetes.io/projected/21cb37c2-74d7-4840-9248-4330f12ead7a-kube-api-access-5zs5t\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549328 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549452 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: 
\"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549580 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4090a0c9-1b3a-47e5-96a2-703fa6edcd5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4090a0c9-1b3a-47e5-96a2-703fa6edcd5b\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549702 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549823 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.549934 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.550057 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-config\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.550142 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cb37c2-74d7-4840-9248-4330f12ead7a-config\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.550222 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.655670 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6def372-844d-4aa5-9499-915742a71d36-config\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657168 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657205 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657253 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657278 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hrqv\" (UniqueName: \"kubernetes.io/projected/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-kube-api-access-6hrqv\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657297 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657318 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657334 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657354 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3302a0dd-90f7-496b-a261-930a2a18ffaa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3302a0dd-90f7-496b-a261-930a2a18ffaa\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657385 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e15edf1f-8cbf-4e1c-9c48-f531dd05584c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e15edf1f-8cbf-4e1c-9c48-f531dd05584c\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657427 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657443 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zs5t\" (UniqueName: \"kubernetes.io/projected/21cb37c2-74d7-4840-9248-4330f12ead7a-kube-api-access-5zs5t\") pod \"logging-loki-compactor-0\" (UID: 
\"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657480 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657549 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk5mm\" (UniqueName: \"kubernetes.io/projected/d6def372-844d-4aa5-9499-915742a71d36-kube-api-access-qk5mm\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657573 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4090a0c9-1b3a-47e5-96a2-703fa6edcd5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4090a0c9-1b3a-47e5-96a2-703fa6edcd5b\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657616 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657646 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657676 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657718 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2e6d7cf1-a0c4-4398-8021-324c323825e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e6d7cf1-a0c4-4398-8021-324c323825e7\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-config\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657795 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cb37c2-74d7-4840-9248-4330f12ead7a-config\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.657815 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" 
(UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.659431 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.659575 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-config\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.660589 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cb37c2-74d7-4840-9248-4330f12ead7a-config\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.661730 4834 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.661750 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3302a0dd-90f7-496b-a261-930a2a18ffaa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3302a0dd-90f7-496b-a261-930a2a18ffaa\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3044f717d0c049fff29aadf8c29f52b3206f5ea070b27e14c78f81e2af4e4ba4/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.662166 4834 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.662207 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e15edf1f-8cbf-4e1c-9c48-f531dd05584c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e15edf1f-8cbf-4e1c-9c48-f531dd05584c\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2f44592cd069afb64f93cc4c8945f560a878400e1f1c4f71173b1c454d0b7784/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.663385 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.664103 4834 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.664132 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4090a0c9-1b3a-47e5-96a2-703fa6edcd5b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4090a0c9-1b3a-47e5-96a2-703fa6edcd5b\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/38f2d4d0b918a367613a9ce6b7f45594fbf3275904457502f2eb45d8b7ee1618/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.666382 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.666822 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.670884 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.671298 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: 
\"kubernetes.io/secret/21cb37c2-74d7-4840-9248-4330f12ead7a-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.674619 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.677382 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zs5t\" (UniqueName: \"kubernetes.io/projected/21cb37c2-74d7-4840-9248-4330f12ead7a-kube-api-access-5zs5t\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.678170 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.678978 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hrqv\" (UniqueName: \"kubernetes.io/projected/e0bb0cb6-3429-4d3e-ac99-162bb485aa1b-kube-api-access-6hrqv\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.687235 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4090a0c9-1b3a-47e5-96a2-703fa6edcd5b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4090a0c9-1b3a-47e5-96a2-703fa6edcd5b\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.691820 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e15edf1f-8cbf-4e1c-9c48-f531dd05584c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e15edf1f-8cbf-4e1c-9c48-f531dd05584c\") pod \"logging-loki-ingester-0\" (UID: \"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b\") " pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.707890 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3302a0dd-90f7-496b-a261-930a2a18ffaa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3302a0dd-90f7-496b-a261-930a2a18ffaa\") pod \"logging-loki-compactor-0\" (UID: \"21cb37c2-74d7-4840-9248-4330f12ead7a\") " pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.728803 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.758585 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6def372-844d-4aa5-9499-915742a71d36-config\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.758625 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.758650 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.758669 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.759766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: 
\"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.758685 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.759824 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6def372-844d-4aa5-9499-915742a71d36-config\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.759879 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk5mm\" (UniqueName: \"kubernetes.io/projected/d6def372-844d-4aa5-9499-915742a71d36-kube-api-access-qk5mm\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.760045 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2e6d7cf1-a0c4-4398-8021-324c323825e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e6d7cf1-a0c4-4398-8021-324c323825e7\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.763264 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: 
\"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.763717 4834 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.763766 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2e6d7cf1-a0c4-4398-8021-324c323825e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e6d7cf1-a0c4-4398-8021-324c323825e7\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dda1742f9b82af28c6171dd85b7f9637a3d00cab9e24d95b0e0ecabe89c1a3bf/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.764117 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.766171 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/d6def372-844d-4aa5-9499-915742a71d36-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.778745 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qk5mm\" (UniqueName: \"kubernetes.io/projected/d6def372-844d-4aa5-9499-915742a71d36-kube-api-access-qk5mm\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.785666 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2e6d7cf1-a0c4-4398-8021-324c323825e7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e6d7cf1-a0c4-4398-8021-324c323825e7\") pod \"logging-loki-index-gateway-0\" (UID: \"d6def372-844d-4aa5-9499-915742a71d36\") " pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.788409 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.886609 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.960458 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4"] Jan 30 21:29:07 crc kubenswrapper[4834]: W0130 21:29:07.971524 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0779eb6_e6eb_41f3_8c01_8072ad63eedd.slice/crio-a5487386321757f531de85f75485c0c82e6fcc39ffb74b5246fe35701e08c032 WatchSource:0}: Error finding container a5487386321757f531de85f75485c0c82e6fcc39ffb74b5246fe35701e08c032: Status 404 returned error can't find the container with id a5487386321757f531de85f75485c0c82e6fcc39ffb74b5246fe35701e08c032 Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.977209 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Jan 30 21:29:07 crc 
kubenswrapper[4834]: I0130 21:29:07.979383 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" event={"ID":"5fc9974d-5ec1-42b1-a557-2601e6168fa1","Type":"ContainerStarted","Data":"f698fefafcc02f80661feba03661f8e7fde0d68e5f4e4a8077e4b312de3a6555"} Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.980207 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" event={"ID":"2ed32950-7326-4344-bcdb-7843ca0162e1","Type":"ContainerStarted","Data":"4498a704c941ad92e6f499159705a2939c6ad767c124a1570bd20593603f7902"} Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.981219 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" event={"ID":"032e9188-65b1-4456-9879-518958f9c1e7","Type":"ContainerStarted","Data":"7079d4abf02bf9370798f32e14393b41e53c90f20b37ba0d5e7d64b3d58cfb9a"} Jan 30 21:29:07 crc kubenswrapper[4834]: I0130 21:29:07.982726 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" event={"ID":"3470b03a-ff5f-4654-b8f4-db3ee90be448","Type":"ContainerStarted","Data":"8c808012a6a5a400e59e4c1c787c3061f056cc3e16e8518a6cd562d1f389732e"} Jan 30 21:29:08 crc kubenswrapper[4834]: W0130 21:29:08.030890 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21cb37c2_74d7_4840_9248_4330f12ead7a.slice/crio-5cb74d0d3dc38ea6069600a1fc0c7bcecd17035137419dda4ac46f923b891ca9 WatchSource:0}: Error finding container 5cb74d0d3dc38ea6069600a1fc0c7bcecd17035137419dda4ac46f923b891ca9: Status 404 returned error can't find the container with id 5cb74d0d3dc38ea6069600a1fc0c7bcecd17035137419dda4ac46f923b891ca9 Jan 30 21:29:08 crc kubenswrapper[4834]: I0130 21:29:08.122909 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-logging/logging-loki-ingester-0"] Jan 30 21:29:08 crc kubenswrapper[4834]: I0130 21:29:08.359784 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Jan 30 21:29:08 crc kubenswrapper[4834]: W0130 21:29:08.372509 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6def372_844d_4aa5_9499_915742a71d36.slice/crio-956b9409dff6dacc06971d892f7205d12c78ea66801273c9b57c6c3dabb51fff WatchSource:0}: Error finding container 956b9409dff6dacc06971d892f7205d12c78ea66801273c9b57c6c3dabb51fff: Status 404 returned error can't find the container with id 956b9409dff6dacc06971d892f7205d12c78ea66801273c9b57c6c3dabb51fff Jan 30 21:29:08 crc kubenswrapper[4834]: I0130 21:29:08.991930 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" event={"ID":"a0779eb6-e6eb-41f3-8c01-8072ad63eedd","Type":"ContainerStarted","Data":"a5487386321757f531de85f75485c0c82e6fcc39ffb74b5246fe35701e08c032"} Jan 30 21:29:08 crc kubenswrapper[4834]: I0130 21:29:08.993906 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"d6def372-844d-4aa5-9499-915742a71d36","Type":"ContainerStarted","Data":"956b9409dff6dacc06971d892f7205d12c78ea66801273c9b57c6c3dabb51fff"} Jan 30 21:29:08 crc kubenswrapper[4834]: I0130 21:29:08.995672 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b","Type":"ContainerStarted","Data":"66819dca308655326fde736b265105771458ac9cebdf9de8ce6f2d98a72c6ca6"} Jan 30 21:29:08 crc kubenswrapper[4834]: I0130 21:29:08.997710 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" 
event={"ID":"21cb37c2-74d7-4840-9248-4330f12ead7a","Type":"ContainerStarted","Data":"5cb74d0d3dc38ea6069600a1fc0c7bcecd17035137419dda4ac46f923b891ca9"} Jan 30 21:29:09 crc kubenswrapper[4834]: I0130 21:29:09.217452 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-49knk" Jan 30 21:29:09 crc kubenswrapper[4834]: I0130 21:29:09.217529 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-49knk" Jan 30 21:29:09 crc kubenswrapper[4834]: I0130 21:29:09.290719 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-49knk" Jan 30 21:29:10 crc kubenswrapper[4834]: I0130 21:29:10.084821 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-49knk" Jan 30 21:29:11 crc kubenswrapper[4834]: I0130 21:29:11.148853 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-49knk"] Jan 30 21:29:11 crc kubenswrapper[4834]: I0130 21:29:11.330713 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-67tqx"] Jan 30 21:29:11 crc kubenswrapper[4834]: I0130 21:29:11.331584 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-67tqx" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerName="registry-server" containerID="cri-o://bb40727cb8deff0d53a0a84d2aeaef020b5346724a6534ddcd5889d1290fe429" gracePeriod=2 Jan 30 21:29:12 crc kubenswrapper[4834]: I0130 21:29:12.022199 4834 generic.go:334] "Generic (PLEG): container finished" podID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerID="bb40727cb8deff0d53a0a84d2aeaef020b5346724a6534ddcd5889d1290fe429" exitCode=0 Jan 30 21:29:12 crc kubenswrapper[4834]: I0130 21:29:12.022279 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-67tqx" event={"ID":"1a1bc0d2-5126-41ab-9d64-cfbd3707572f","Type":"ContainerDied","Data":"bb40727cb8deff0d53a0a84d2aeaef020b5346724a6534ddcd5889d1290fe429"} Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.313987 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.453455 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpmmp\" (UniqueName: \"kubernetes.io/projected/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-kube-api-access-gpmmp\") pod \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.453514 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-catalog-content\") pod \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.453561 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-utilities\") pod \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\" (UID: \"1a1bc0d2-5126-41ab-9d64-cfbd3707572f\") " Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.455078 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-utilities" (OuterVolumeSpecName: "utilities") pod "1a1bc0d2-5126-41ab-9d64-cfbd3707572f" (UID: "1a1bc0d2-5126-41ab-9d64-cfbd3707572f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.461347 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-kube-api-access-gpmmp" (OuterVolumeSpecName: "kube-api-access-gpmmp") pod "1a1bc0d2-5126-41ab-9d64-cfbd3707572f" (UID: "1a1bc0d2-5126-41ab-9d64-cfbd3707572f"). InnerVolumeSpecName "kube-api-access-gpmmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.530053 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a1bc0d2-5126-41ab-9d64-cfbd3707572f" (UID: "1a1bc0d2-5126-41ab-9d64-cfbd3707572f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.555354 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpmmp\" (UniqueName: \"kubernetes.io/projected/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-kube-api-access-gpmmp\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.555458 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:13 crc kubenswrapper[4834]: I0130 21:29:13.555478 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a1bc0d2-5126-41ab-9d64-cfbd3707572f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:29:14 crc kubenswrapper[4834]: I0130 21:29:14.039810 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-67tqx" 
event={"ID":"1a1bc0d2-5126-41ab-9d64-cfbd3707572f","Type":"ContainerDied","Data":"14dee4a39ec89321c4b7d256fcf26458a8a13cbd2147ac93ab33076e3a8c87c7"} Jan 30 21:29:14 crc kubenswrapper[4834]: I0130 21:29:14.039867 4834 scope.go:117] "RemoveContainer" containerID="bb40727cb8deff0d53a0a84d2aeaef020b5346724a6534ddcd5889d1290fe429" Jan 30 21:29:14 crc kubenswrapper[4834]: I0130 21:29:14.039911 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-67tqx" Jan 30 21:29:14 crc kubenswrapper[4834]: I0130 21:29:14.064867 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-67tqx"] Jan 30 21:29:14 crc kubenswrapper[4834]: I0130 21:29:14.082490 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-67tqx"] Jan 30 21:29:15 crc kubenswrapper[4834]: I0130 21:29:15.542950 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" path="/var/lib/kubelet/pods/1a1bc0d2-5126-41ab-9d64-cfbd3707572f/volumes" Jan 30 21:29:15 crc kubenswrapper[4834]: I0130 21:29:15.699245 4834 scope.go:117] "RemoveContainer" containerID="a7270e9653808a9ff69cbbf224e0e31cc1d567ce4cc9ef35281292cf1d0d79a0" Jan 30 21:29:16 crc kubenswrapper[4834]: I0130 21:29:16.034303 4834 scope.go:117] "RemoveContainer" containerID="f7c0747ae9e96fba5c6e0704bde24a12243ada90ff27102130762b2e85b28fcd" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.064882 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" event={"ID":"3470b03a-ff5f-4654-b8f4-db3ee90be448","Type":"ContainerStarted","Data":"5cc72134364f0d5b94941177d5a35774b073b5735f936a8f6a34a438aa810514"} Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.065728 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.067200 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" event={"ID":"5fc9974d-5ec1-42b1-a557-2601e6168fa1","Type":"ContainerStarted","Data":"d8d437757f9a423523d6bd053b38c45c21eac552310f5f501c5c073d93ed325d"} Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.067432 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.070890 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"d6def372-844d-4aa5-9499-915742a71d36","Type":"ContainerStarted","Data":"edf7a20d550384576e8cc7e2d9bbea5bcb1ff0d3793f88cd3e292c9b65ec610c"} Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.071068 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.073057 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"e0bb0cb6-3429-4d3e-ac99-162bb485aa1b","Type":"ContainerStarted","Data":"9798a10bea637f1e9eae84ee05a67a8466b382a25c5f4866fce33f68f3f37b46"} Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.073243 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.075124 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" event={"ID":"a0779eb6-e6eb-41f3-8c01-8072ad63eedd","Type":"ContainerStarted","Data":"2c7fe7251dd0fe7bebe7abe28db60273f228eaeb9ff313caecc5ffab03820a34"} Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.077422 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" event={"ID":"2ed32950-7326-4344-bcdb-7843ca0162e1","Type":"ContainerStarted","Data":"6601450748f661f9862b3f1d9a29c9a62b216d47e5abbc861f0b18a2e22d6ff9"} Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.080122 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" event={"ID":"032e9188-65b1-4456-9879-518958f9c1e7","Type":"ContainerStarted","Data":"b81a061852d7d846faf0d7b709ee9552fe5d65572b3f9f103d2fdd72739d9213"} Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.080308 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.102329 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" podStartSLOduration=2.0308889 podStartE2EDuration="11.102306262s" podCreationTimestamp="2026-01-30 21:29:06 +0000 UTC" firstStartedPulling="2026-01-30 21:29:07.025461279 +0000 UTC m=+798.178607417" lastFinishedPulling="2026-01-30 21:29:16.096878631 +0000 UTC m=+807.250024779" observedRunningTime="2026-01-30 21:29:17.091990719 +0000 UTC m=+808.245136867" watchObservedRunningTime="2026-01-30 21:29:17.102306262 +0000 UTC m=+808.255452400" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.123943 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.3443353829999998 podStartE2EDuration="11.123918158s" podCreationTimestamp="2026-01-30 21:29:06 +0000 UTC" firstStartedPulling="2026-01-30 21:29:08.374165826 +0000 UTC m=+799.527311964" lastFinishedPulling="2026-01-30 21:29:16.153748591 +0000 UTC m=+807.306894739" observedRunningTime="2026-01-30 21:29:17.119309587 +0000 UTC m=+808.272455765" 
watchObservedRunningTime="2026-01-30 21:29:17.123918158 +0000 UTC m=+808.277064316" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.155223 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.130299019 podStartE2EDuration="11.155200549s" podCreationTimestamp="2026-01-30 21:29:06 +0000 UTC" firstStartedPulling="2026-01-30 21:29:08.133362739 +0000 UTC m=+799.286508877" lastFinishedPulling="2026-01-30 21:29:16.158264259 +0000 UTC m=+807.311410407" observedRunningTime="2026-01-30 21:29:17.145795421 +0000 UTC m=+808.298941629" watchObservedRunningTime="2026-01-30 21:29:17.155200549 +0000 UTC m=+808.308346697" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.173483 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" podStartSLOduration=2.014149344 podStartE2EDuration="11.173450538s" podCreationTimestamp="2026-01-30 21:29:06 +0000 UTC" firstStartedPulling="2026-01-30 21:29:06.973873031 +0000 UTC m=+798.127019169" lastFinishedPulling="2026-01-30 21:29:16.133174195 +0000 UTC m=+807.286320363" observedRunningTime="2026-01-30 21:29:17.167966782 +0000 UTC m=+808.321112930" watchObservedRunningTime="2026-01-30 21:29:17.173450538 +0000 UTC m=+808.326596706" Jan 30 21:29:17 crc kubenswrapper[4834]: I0130 21:29:17.190721 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" podStartSLOduration=2.316546746 podStartE2EDuration="11.19070244s" podCreationTimestamp="2026-01-30 21:29:06 +0000 UTC" firstStartedPulling="2026-01-30 21:29:07.258439584 +0000 UTC m=+798.411585722" lastFinishedPulling="2026-01-30 21:29:16.132595268 +0000 UTC m=+807.285741416" observedRunningTime="2026-01-30 21:29:17.185329297 +0000 UTC m=+808.338475435" watchObservedRunningTime="2026-01-30 21:29:17.19070244 +0000 UTC m=+808.343848578" Jan 
30 21:29:18 crc kubenswrapper[4834]: I0130 21:29:18.088247 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"21cb37c2-74d7-4840-9248-4330f12ead7a","Type":"ContainerStarted","Data":"d84972a5c30cb13dd915ab7c8853af7c1d58a2a362d6607b26ae25ed4728cc8a"} Jan 30 21:29:18 crc kubenswrapper[4834]: I0130 21:29:18.129037 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.738782365 podStartE2EDuration="12.129013359s" podCreationTimestamp="2026-01-30 21:29:06 +0000 UTC" firstStartedPulling="2026-01-30 21:29:08.034780711 +0000 UTC m=+799.187926849" lastFinishedPulling="2026-01-30 21:29:16.425011705 +0000 UTC m=+807.578157843" observedRunningTime="2026-01-30 21:29:18.1220403 +0000 UTC m=+809.275186468" watchObservedRunningTime="2026-01-30 21:29:18.129013359 +0000 UTC m=+809.282159497" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.097345 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" event={"ID":"a0779eb6-e6eb-41f3-8c01-8072ad63eedd","Type":"ContainerStarted","Data":"240a746aaf2cb157b19cbcb8c14f2671115d1945609d98ba1f4339c843bfc28d"} Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.098083 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.098139 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.101552 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" event={"ID":"2ed32950-7326-4344-bcdb-7843ca0162e1","Type":"ContainerStarted","Data":"b286d64ceeb02cdf0a86cfb030a4deeb8a113f1028adcc40769044e943c0e3f7"} Jan 30 
21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.101759 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.101919 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.115023 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.120604 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.120824 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.131749 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-m7ft4" podStartSLOduration=2.546280178 podStartE2EDuration="13.131718663s" podCreationTimestamp="2026-01-30 21:29:06 +0000 UTC" firstStartedPulling="2026-01-30 21:29:07.977173861 +0000 UTC m=+799.130320009" lastFinishedPulling="2026-01-30 21:29:18.562612346 +0000 UTC m=+809.715758494" observedRunningTime="2026-01-30 21:29:19.131076004 +0000 UTC m=+810.284222172" watchObservedRunningTime="2026-01-30 21:29:19.131718663 +0000 UTC m=+810.284864891" Jan 30 21:29:19 crc kubenswrapper[4834]: I0130 21:29:19.168996 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" podStartSLOduration=1.7592967069999998 podStartE2EDuration="13.168965283s" podCreationTimestamp="2026-01-30 21:29:06 +0000 UTC" firstStartedPulling="2026-01-30 21:29:07.160217247 +0000 UTC 
m=+798.313363385" lastFinishedPulling="2026-01-30 21:29:18.569885803 +0000 UTC m=+809.723031961" observedRunningTime="2026-01-30 21:29:19.166879784 +0000 UTC m=+810.320025932" watchObservedRunningTime="2026-01-30 21:29:19.168965283 +0000 UTC m=+810.322111451" Jan 30 21:29:20 crc kubenswrapper[4834]: I0130 21:29:20.112007 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:20 crc kubenswrapper[4834]: I0130 21:29:20.131238 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5d9fb787f7-rkvxh" Jan 30 21:29:36 crc kubenswrapper[4834]: I0130 21:29:36.598344 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-7m47l" Jan 30 21:29:36 crc kubenswrapper[4834]: I0130 21:29:36.730888 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76788598db-tcqxt" Jan 30 21:29:36 crc kubenswrapper[4834]: I0130 21:29:36.799969 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-69d9546745-tcdrc" Jan 30 21:29:37 crc kubenswrapper[4834]: I0130 21:29:37.738174 4834 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Jan 30 21:29:37 crc kubenswrapper[4834]: I0130 21:29:37.738516 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="e0bb0cb6-3429-4d3e-ac99-162bb485aa1b" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 21:29:37 crc kubenswrapper[4834]: I0130 21:29:37.795475 4834 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Jan 30 21:29:37 crc kubenswrapper[4834]: I0130 21:29:37.892946 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Jan 30 21:29:47 crc kubenswrapper[4834]: I0130 21:29:47.738488 4834 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Jan 30 21:29:47 crc kubenswrapper[4834]: I0130 21:29:47.739193 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="e0bb0cb6-3429-4d3e-ac99-162bb485aa1b" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 21:29:57 crc kubenswrapper[4834]: I0130 21:29:57.734496 4834 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Jan 30 21:29:57 crc kubenswrapper[4834]: I0130 21:29:57.735257 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="e0bb0cb6-3429-4d3e-ac99-162bb485aa1b" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.173846 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh"] Jan 30 21:30:00 crc kubenswrapper[4834]: E0130 21:30:00.174889 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerName="extract-content" Jan 30 21:30:00 crc 
kubenswrapper[4834]: I0130 21:30:00.174979 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerName="extract-content" Jan 30 21:30:00 crc kubenswrapper[4834]: E0130 21:30:00.175060 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerName="extract-utilities" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.175125 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerName="extract-utilities" Jan 30 21:30:00 crc kubenswrapper[4834]: E0130 21:30:00.175199 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.175265 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.176849 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a1bc0d2-5126-41ab-9d64-cfbd3707572f" containerName="registry-server" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.177556 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.185507 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.186022 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.190736 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh"] Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.263589 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2197ac71-9c5a-483c-9944-518bd37b0583-config-volume\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.263955 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7fp5\" (UniqueName: \"kubernetes.io/projected/2197ac71-9c5a-483c-9944-518bd37b0583-kube-api-access-w7fp5\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.264001 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2197ac71-9c5a-483c-9944-518bd37b0583-secret-volume\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.365294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2197ac71-9c5a-483c-9944-518bd37b0583-config-volume\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.365348 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7fp5\" (UniqueName: \"kubernetes.io/projected/2197ac71-9c5a-483c-9944-518bd37b0583-kube-api-access-w7fp5\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.365373 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2197ac71-9c5a-483c-9944-518bd37b0583-secret-volume\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.367731 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2197ac71-9c5a-483c-9944-518bd37b0583-config-volume\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.377483 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2197ac71-9c5a-483c-9944-518bd37b0583-secret-volume\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.385591 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7fp5\" (UniqueName: \"kubernetes.io/projected/2197ac71-9c5a-483c-9944-518bd37b0583-kube-api-access-w7fp5\") pod \"collect-profiles-29496810-vtkrh\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.458697 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dts2q"] Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.460025 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.466750 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-utilities\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.466911 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-catalog-content\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.467101 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rhqkf\" (UniqueName: \"kubernetes.io/projected/cf35dcfb-746c-4f5c-aa0a-853adde1900f-kube-api-access-rhqkf\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.469906 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dts2q"] Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.511762 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.568048 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-catalog-content\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.568128 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhqkf\" (UniqueName: \"kubernetes.io/projected/cf35dcfb-746c-4f5c-aa0a-853adde1900f-kube-api-access-rhqkf\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.568201 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-utilities\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.568494 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-catalog-content\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.568690 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-utilities\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.589548 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhqkf\" (UniqueName: \"kubernetes.io/projected/cf35dcfb-746c-4f5c-aa0a-853adde1900f-kube-api-access-rhqkf\") pod \"redhat-operators-dts2q\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.702818 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh"] Jan 30 21:30:00 crc kubenswrapper[4834]: I0130 21:30:00.787973 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:01 crc kubenswrapper[4834]: I0130 21:30:01.184986 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dts2q"] Jan 30 21:30:01 crc kubenswrapper[4834]: W0130 21:30:01.189236 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf35dcfb_746c_4f5c_aa0a_853adde1900f.slice/crio-904f0568fc1ed5937ba343e675ed09fef68e56ef7fee950c1663ae9ad75a4461 WatchSource:0}: Error finding container 904f0568fc1ed5937ba343e675ed09fef68e56ef7fee950c1663ae9ad75a4461: Status 404 returned error can't find the container with id 904f0568fc1ed5937ba343e675ed09fef68e56ef7fee950c1663ae9ad75a4461 Jan 30 21:30:01 crc kubenswrapper[4834]: I0130 21:30:01.497503 4834 generic.go:334] "Generic (PLEG): container finished" podID="2197ac71-9c5a-483c-9944-518bd37b0583" containerID="e0ef5120563536268b094d4c5b997bd2a18d963a578fa1a50eeded6edc98285c" exitCode=0 Jan 30 21:30:01 crc kubenswrapper[4834]: I0130 21:30:01.497598 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" event={"ID":"2197ac71-9c5a-483c-9944-518bd37b0583","Type":"ContainerDied","Data":"e0ef5120563536268b094d4c5b997bd2a18d963a578fa1a50eeded6edc98285c"} Jan 30 21:30:01 crc kubenswrapper[4834]: I0130 21:30:01.497658 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" event={"ID":"2197ac71-9c5a-483c-9944-518bd37b0583","Type":"ContainerStarted","Data":"e7be2f48123af5945d08c50c83ad60e6343b4ceca2ec2cbec987a22da9047630"} Jan 30 21:30:01 crc kubenswrapper[4834]: I0130 21:30:01.498680 4834 generic.go:334] "Generic (PLEG): container finished" podID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerID="a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2" exitCode=0 Jan 30 21:30:01 
crc kubenswrapper[4834]: I0130 21:30:01.498714 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dts2q" event={"ID":"cf35dcfb-746c-4f5c-aa0a-853adde1900f","Type":"ContainerDied","Data":"a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2"} Jan 30 21:30:01 crc kubenswrapper[4834]: I0130 21:30:01.498742 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dts2q" event={"ID":"cf35dcfb-746c-4f5c-aa0a-853adde1900f","Type":"ContainerStarted","Data":"904f0568fc1ed5937ba343e675ed09fef68e56ef7fee950c1663ae9ad75a4461"} Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.812771 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.897848 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2197ac71-9c5a-483c-9944-518bd37b0583-secret-volume\") pod \"2197ac71-9c5a-483c-9944-518bd37b0583\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.898036 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2197ac71-9c5a-483c-9944-518bd37b0583-config-volume\") pod \"2197ac71-9c5a-483c-9944-518bd37b0583\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.898155 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7fp5\" (UniqueName: \"kubernetes.io/projected/2197ac71-9c5a-483c-9944-518bd37b0583-kube-api-access-w7fp5\") pod \"2197ac71-9c5a-483c-9944-518bd37b0583\" (UID: \"2197ac71-9c5a-483c-9944-518bd37b0583\") " Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.899369 4834 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2197ac71-9c5a-483c-9944-518bd37b0583-config-volume" (OuterVolumeSpecName: "config-volume") pod "2197ac71-9c5a-483c-9944-518bd37b0583" (UID: "2197ac71-9c5a-483c-9944-518bd37b0583"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.903735 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2197ac71-9c5a-483c-9944-518bd37b0583-kube-api-access-w7fp5" (OuterVolumeSpecName: "kube-api-access-w7fp5") pod "2197ac71-9c5a-483c-9944-518bd37b0583" (UID: "2197ac71-9c5a-483c-9944-518bd37b0583"). InnerVolumeSpecName "kube-api-access-w7fp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.906262 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2197ac71-9c5a-483c-9944-518bd37b0583-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2197ac71-9c5a-483c-9944-518bd37b0583" (UID: "2197ac71-9c5a-483c-9944-518bd37b0583"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.999618 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7fp5\" (UniqueName: \"kubernetes.io/projected/2197ac71-9c5a-483c-9944-518bd37b0583-kube-api-access-w7fp5\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.999661 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2197ac71-9c5a-483c-9944-518bd37b0583-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:02 crc kubenswrapper[4834]: I0130 21:30:02.999675 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2197ac71-9c5a-483c-9944-518bd37b0583-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:03 crc kubenswrapper[4834]: I0130 21:30:03.516544 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" event={"ID":"2197ac71-9c5a-483c-9944-518bd37b0583","Type":"ContainerDied","Data":"e7be2f48123af5945d08c50c83ad60e6343b4ceca2ec2cbec987a22da9047630"} Jan 30 21:30:03 crc kubenswrapper[4834]: I0130 21:30:03.516581 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7be2f48123af5945d08c50c83ad60e6343b4ceca2ec2cbec987a22da9047630" Jan 30 21:30:03 crc kubenswrapper[4834]: I0130 21:30:03.516659 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh" Jan 30 21:30:04 crc kubenswrapper[4834]: I0130 21:30:04.525598 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dts2q" event={"ID":"cf35dcfb-746c-4f5c-aa0a-853adde1900f","Type":"ContainerStarted","Data":"e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299"} Jan 30 21:30:05 crc kubenswrapper[4834]: I0130 21:30:05.536950 4834 generic.go:334] "Generic (PLEG): container finished" podID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerID="e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299" exitCode=0 Jan 30 21:30:05 crc kubenswrapper[4834]: I0130 21:30:05.548255 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dts2q" event={"ID":"cf35dcfb-746c-4f5c-aa0a-853adde1900f","Type":"ContainerDied","Data":"e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299"} Jan 30 21:30:06 crc kubenswrapper[4834]: I0130 21:30:06.547952 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dts2q" event={"ID":"cf35dcfb-746c-4f5c-aa0a-853adde1900f","Type":"ContainerStarted","Data":"d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a"} Jan 30 21:30:06 crc kubenswrapper[4834]: I0130 21:30:06.570934 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dts2q" podStartSLOduration=2.11342115 podStartE2EDuration="6.570916374s" podCreationTimestamp="2026-01-30 21:30:00 +0000 UTC" firstStartedPulling="2026-01-30 21:30:01.499953948 +0000 UTC m=+852.653100076" lastFinishedPulling="2026-01-30 21:30:05.957449162 +0000 UTC m=+857.110595300" observedRunningTime="2026-01-30 21:30:06.569948376 +0000 UTC m=+857.723094534" watchObservedRunningTime="2026-01-30 21:30:06.570916374 +0000 UTC m=+857.724062512" Jan 30 21:30:07 crc kubenswrapper[4834]: I0130 
21:30:07.738696 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Jan 30 21:30:10 crc kubenswrapper[4834]: I0130 21:30:10.788998 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:10 crc kubenswrapper[4834]: I0130 21:30:10.789079 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:11 crc kubenswrapper[4834]: I0130 21:30:11.841888 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dts2q" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="registry-server" probeResult="failure" output=< Jan 30 21:30:11 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:30:11 crc kubenswrapper[4834]: > Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.594758 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jl729"] Jan 30 21:30:16 crc kubenswrapper[4834]: E0130 21:30:16.595353 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2197ac71-9c5a-483c-9944-518bd37b0583" containerName="collect-profiles" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.595368 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2197ac71-9c5a-483c-9944-518bd37b0583" containerName="collect-profiles" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.595500 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2197ac71-9c5a-483c-9944-518bd37b0583" containerName="collect-profiles" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.596458 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.648681 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jl729"] Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.722258 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-catalog-content\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.722303 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-utilities\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.722372 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tt2\" (UniqueName: \"kubernetes.io/projected/895e1507-aebd-4f35-b07e-2074bda88037-kube-api-access-g2tt2\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.823503 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-utilities\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.823600 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-g2tt2\" (UniqueName: \"kubernetes.io/projected/895e1507-aebd-4f35-b07e-2074bda88037-kube-api-access-g2tt2\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.823660 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-catalog-content\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.824095 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-catalog-content\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.824212 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-utilities\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.845577 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tt2\" (UniqueName: \"kubernetes.io/projected/895e1507-aebd-4f35-b07e-2074bda88037-kube-api-access-g2tt2\") pod \"redhat-marketplace-jl729\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:16 crc kubenswrapper[4834]: I0130 21:30:16.930987 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:17 crc kubenswrapper[4834]: I0130 21:30:17.423700 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jl729"] Jan 30 21:30:17 crc kubenswrapper[4834]: I0130 21:30:17.660981 4834 generic.go:334] "Generic (PLEG): container finished" podID="895e1507-aebd-4f35-b07e-2074bda88037" containerID="91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe" exitCode=0 Jan 30 21:30:17 crc kubenswrapper[4834]: I0130 21:30:17.661037 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl729" event={"ID":"895e1507-aebd-4f35-b07e-2074bda88037","Type":"ContainerDied","Data":"91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe"} Jan 30 21:30:17 crc kubenswrapper[4834]: I0130 21:30:17.661088 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl729" event={"ID":"895e1507-aebd-4f35-b07e-2074bda88037","Type":"ContainerStarted","Data":"d48777dabbe9add62528dfc1a00239583d0270f37319f9590ebcc39b4dce9f0e"} Jan 30 21:30:18 crc kubenswrapper[4834]: I0130 21:30:18.671112 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl729" event={"ID":"895e1507-aebd-4f35-b07e-2074bda88037","Type":"ContainerStarted","Data":"b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41"} Jan 30 21:30:19 crc kubenswrapper[4834]: I0130 21:30:19.683567 4834 generic.go:334] "Generic (PLEG): container finished" podID="895e1507-aebd-4f35-b07e-2074bda88037" containerID="b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41" exitCode=0 Jan 30 21:30:19 crc kubenswrapper[4834]: I0130 21:30:19.683682 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl729" 
event={"ID":"895e1507-aebd-4f35-b07e-2074bda88037","Type":"ContainerDied","Data":"b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41"} Jan 30 21:30:20 crc kubenswrapper[4834]: I0130 21:30:20.706606 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl729" event={"ID":"895e1507-aebd-4f35-b07e-2074bda88037","Type":"ContainerStarted","Data":"c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6"} Jan 30 21:30:20 crc kubenswrapper[4834]: I0130 21:30:20.731270 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jl729" podStartSLOduration=2.210124061 podStartE2EDuration="4.731254013s" podCreationTimestamp="2026-01-30 21:30:16 +0000 UTC" firstStartedPulling="2026-01-30 21:30:17.663177984 +0000 UTC m=+868.816324172" lastFinishedPulling="2026-01-30 21:30:20.184307986 +0000 UTC m=+871.337454124" observedRunningTime="2026-01-30 21:30:20.727245259 +0000 UTC m=+871.880391437" watchObservedRunningTime="2026-01-30 21:30:20.731254013 +0000 UTC m=+871.884400151" Jan 30 21:30:20 crc kubenswrapper[4834]: I0130 21:30:20.849460 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:20 crc kubenswrapper[4834]: I0130 21:30:20.899411 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:22 crc kubenswrapper[4834]: I0130 21:30:22.973305 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dts2q"] Jan 30 21:30:22 crc kubenswrapper[4834]: I0130 21:30:22.973873 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dts2q" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="registry-server" 
containerID="cri-o://d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a" gracePeriod=2 Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.437917 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.619239 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhqkf\" (UniqueName: \"kubernetes.io/projected/cf35dcfb-746c-4f5c-aa0a-853adde1900f-kube-api-access-rhqkf\") pod \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.619498 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-utilities\") pod \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.619539 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-catalog-content\") pod \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\" (UID: \"cf35dcfb-746c-4f5c-aa0a-853adde1900f\") " Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.621244 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-utilities" (OuterVolumeSpecName: "utilities") pod "cf35dcfb-746c-4f5c-aa0a-853adde1900f" (UID: "cf35dcfb-746c-4f5c-aa0a-853adde1900f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.627726 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf35dcfb-746c-4f5c-aa0a-853adde1900f-kube-api-access-rhqkf" (OuterVolumeSpecName: "kube-api-access-rhqkf") pod "cf35dcfb-746c-4f5c-aa0a-853adde1900f" (UID: "cf35dcfb-746c-4f5c-aa0a-853adde1900f"). InnerVolumeSpecName "kube-api-access-rhqkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.721821 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhqkf\" (UniqueName: \"kubernetes.io/projected/cf35dcfb-746c-4f5c-aa0a-853adde1900f-kube-api-access-rhqkf\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.721857 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.728038 4834 generic.go:334] "Generic (PLEG): container finished" podID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerID="d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a" exitCode=0 Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.728106 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dts2q" event={"ID":"cf35dcfb-746c-4f5c-aa0a-853adde1900f","Type":"ContainerDied","Data":"d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a"} Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.728165 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dts2q" event={"ID":"cf35dcfb-746c-4f5c-aa0a-853adde1900f","Type":"ContainerDied","Data":"904f0568fc1ed5937ba343e675ed09fef68e56ef7fee950c1663ae9ad75a4461"} Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 
21:30:23.728115 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dts2q" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.728205 4834 scope.go:117] "RemoveContainer" containerID="d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.737861 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf35dcfb-746c-4f5c-aa0a-853adde1900f" (UID: "cf35dcfb-746c-4f5c-aa0a-853adde1900f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.751540 4834 scope.go:117] "RemoveContainer" containerID="e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.782065 4834 scope.go:117] "RemoveContainer" containerID="a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.808139 4834 scope.go:117] "RemoveContainer" containerID="d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a" Jan 30 21:30:23 crc kubenswrapper[4834]: E0130 21:30:23.808855 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a\": container with ID starting with d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a not found: ID does not exist" containerID="d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.808922 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a"} err="failed to get container status \"d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a\": rpc error: code = NotFound desc = could not find container \"d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a\": container with ID starting with d433329aba90644978562aa21299521e34acd69e3319620336e62d88eabd4c8a not found: ID does not exist" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.808949 4834 scope.go:117] "RemoveContainer" containerID="e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299" Jan 30 21:30:23 crc kubenswrapper[4834]: E0130 21:30:23.809330 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299\": container with ID starting with e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299 not found: ID does not exist" containerID="e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.809377 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299"} err="failed to get container status \"e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299\": rpc error: code = NotFound desc = could not find container \"e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299\": container with ID starting with e2158695b5d2e764e487c90d31ef9e139244d98658cfec673c1163eff3be3299 not found: ID does not exist" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.809448 4834 scope.go:117] "RemoveContainer" containerID="a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2" Jan 30 21:30:23 crc kubenswrapper[4834]: E0130 21:30:23.809923 4834 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2\": container with ID starting with a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2 not found: ID does not exist" containerID="a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.809964 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2"} err="failed to get container status \"a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2\": rpc error: code = NotFound desc = could not find container \"a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2\": container with ID starting with a712f5711c5be5c8e09fc823647cbe37577107317d02d456938e766b528244f2 not found: ID does not exist" Jan 30 21:30:23 crc kubenswrapper[4834]: I0130 21:30:23.822959 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf35dcfb-746c-4f5c-aa0a-853adde1900f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:24 crc kubenswrapper[4834]: I0130 21:30:24.074168 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dts2q"] Jan 30 21:30:24 crc kubenswrapper[4834]: I0130 21:30:24.081297 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dts2q"] Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.546765 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" path="/var/lib/kubelet/pods/cf35dcfb-746c-4f5c-aa0a-853adde1900f/volumes" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.939449 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-n2bmz"] Jan 30 21:30:25 crc kubenswrapper[4834]: 
E0130 21:30:25.939794 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="registry-server" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.939820 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="registry-server" Jan 30 21:30:25 crc kubenswrapper[4834]: E0130 21:30:25.939854 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="extract-content" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.939868 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="extract-content" Jan 30 21:30:25 crc kubenswrapper[4834]: E0130 21:30:25.939887 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="extract-utilities" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.939902 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="extract-utilities" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.940093 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf35dcfb-746c-4f5c-aa0a-853adde1900f" containerName="registry-server" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.940799 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-n2bmz" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.944578 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.944578 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.945177 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.945272 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-rtbk2" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.945382 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.959554 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Jan 30 21:30:25 crc kubenswrapper[4834]: I0130 21:30:25.972009 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-n2bmz"] Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.022456 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-n2bmz"] Jan 30 21:30:26 crc kubenswrapper[4834]: E0130 21:30:26.023116 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-6tnwl metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-n2bmz" podUID="5a91ae6a-af0f-4a2b-8466-dab0a3152b26" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.052958 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tnwl\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-kube-api-access-6tnwl\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053008 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-token\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053032 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-datadir\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053050 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-trusted-ca\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053204 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-sa-token\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053348 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-tmp\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053389 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-entrypoint\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053568 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config-openshift-service-cacrt\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053643 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-metrics\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053710 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-syslog-receiver\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.053759 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155584 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-trusted-ca\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155633 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-sa-token\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155682 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-tmp\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155700 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-entrypoint\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155729 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config-openshift-service-cacrt\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " 
pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155752 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-metrics\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155775 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-syslog-receiver\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155794 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155819 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tnwl\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-kube-api-access-6tnwl\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155836 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-token\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155850 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-datadir\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.155906 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-datadir\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.157326 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-trusted-ca\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.157930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.158063 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-entrypoint\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.158273 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config-openshift-service-cacrt\") pod \"collector-n2bmz\" (UID: 
\"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.161653 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-tmp\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.163639 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-syslog-receiver\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.171658 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-metrics\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.172532 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-token\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.175207 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tnwl\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-kube-api-access-6tnwl\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.176211 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-sa-token\") pod \"collector-n2bmz\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.749651 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.764631 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-n2bmz" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.931519 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.931602 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.965976 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-token\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966043 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-sa-token\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966074 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-tmp\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: 
\"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966107 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-trusted-ca\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966157 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tnwl\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-kube-api-access-6tnwl\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966529 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-metrics\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966579 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966612 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config-openshift-service-cacrt\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966654 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" 
(UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-syslog-receiver\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966680 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-datadir\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.966702 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-entrypoint\") pod \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\" (UID: \"5a91ae6a-af0f-4a2b-8466-dab0a3152b26\") " Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.968304 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-datadir" (OuterVolumeSpecName: "datadir") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.968599 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "config-openshift-service-cacrt". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.968914 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config" (OuterVolumeSpecName: "config") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.969017 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.969164 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.973449 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-metrics" (OuterVolumeSpecName: "metrics") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.974548 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-sa-token" (OuterVolumeSpecName: "sa-token") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.974543 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.976849 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-kube-api-access-6tnwl" (OuterVolumeSpecName: "kube-api-access-6tnwl") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "kube-api-access-6tnwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.982586 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-tmp" (OuterVolumeSpecName: "tmp") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.984752 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-token" (OuterVolumeSpecName: "collector-token") pod "5a91ae6a-af0f-4a2b-8466-dab0a3152b26" (UID: "5a91ae6a-af0f-4a2b-8466-dab0a3152b26"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:30:26 crc kubenswrapper[4834]: I0130 21:30:26.989428 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068558 4834 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068627 4834 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-sa-token\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068646 4834 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-tmp\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068663 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068681 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tnwl\" (UniqueName: \"kubernetes.io/projected/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-kube-api-access-6tnwl\") on node \"crc\" DevicePath \"\"" Jan 
30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068698 4834 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-metrics\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068714 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068732 4834 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068752 4834 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068771 4834 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-datadir\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.068787 4834 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/5a91ae6a-af0f-4a2b-8466-dab0a3152b26-entrypoint\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.755858 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-n2bmz" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.818976 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-n2bmz"] Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.827171 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-n2bmz"] Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.833664 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.837641 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-zjpxd"] Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.839083 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.859876 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.861403 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.861679 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-rtbk2" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.861855 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.864647 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.869089 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-zjpxd"] Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 
21:30:27.883647 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.980807 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-trusted-ca\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.980895 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-metrics\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.980927 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/854d0b76-06df-4a28-ad4e-b50396ef3248-datadir\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.980947 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-config-openshift-service-cacrt\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.980967 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-collector-token\") pod \"collector-zjpxd\" (UID: 
\"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.980986 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvjlp\" (UniqueName: \"kubernetes.io/projected/854d0b76-06df-4a28-ad4e-b50396ef3248-kube-api-access-jvjlp\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.981004 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-config\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.981024 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/854d0b76-06df-4a28-ad4e-b50396ef3248-sa-token\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.981166 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-entrypoint\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.981309 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/854d0b76-06df-4a28-ad4e-b50396ef3248-tmp\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 
21:30:27 crc kubenswrapper[4834]: I0130 21:30:27.981359 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-collector-syslog-receiver\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.082838 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-config-openshift-service-cacrt\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.082914 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-collector-token\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.082943 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvjlp\" (UniqueName: \"kubernetes.io/projected/854d0b76-06df-4a28-ad4e-b50396ef3248-kube-api-access-jvjlp\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.082970 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-config\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083001 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/854d0b76-06df-4a28-ad4e-b50396ef3248-sa-token\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083028 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-entrypoint\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083080 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/854d0b76-06df-4a28-ad4e-b50396ef3248-tmp\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-collector-syslog-receiver\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083163 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-trusted-ca\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083209 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-metrics\") pod 
\"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083242 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/854d0b76-06df-4a28-ad4e-b50396ef3248-datadir\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083324 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/854d0b76-06df-4a28-ad4e-b50396ef3248-datadir\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.083736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-config-openshift-service-cacrt\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.084131 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-config\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.084153 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-entrypoint\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.084856 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/854d0b76-06df-4a28-ad4e-b50396ef3248-trusted-ca\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.087126 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-collector-token\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.087188 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-metrics\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.088260 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/854d0b76-06df-4a28-ad4e-b50396ef3248-collector-syslog-receiver\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.088505 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/854d0b76-06df-4a28-ad4e-b50396ef3248-tmp\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.103451 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/854d0b76-06df-4a28-ad4e-b50396ef3248-sa-token\") pod \"collector-zjpxd\" (UID: 
\"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.104948 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvjlp\" (UniqueName: \"kubernetes.io/projected/854d0b76-06df-4a28-ad4e-b50396ef3248-kube-api-access-jvjlp\") pod \"collector-zjpxd\" (UID: \"854d0b76-06df-4a28-ad4e-b50396ef3248\") " pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.173420 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jl729"] Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.184264 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-zjpxd" Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.590696 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-zjpxd"] Jan 30 21:30:28 crc kubenswrapper[4834]: I0130 21:30:28.767189 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-zjpxd" event={"ID":"854d0b76-06df-4a28-ad4e-b50396ef3248","Type":"ContainerStarted","Data":"ad43d05383a8fae4f291c905c4ac9eaae456abc2e3ca863e2180d3b644085bdf"} Jan 30 21:30:29 crc kubenswrapper[4834]: I0130 21:30:29.542358 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a91ae6a-af0f-4a2b-8466-dab0a3152b26" path="/var/lib/kubelet/pods/5a91ae6a-af0f-4a2b-8466-dab0a3152b26/volumes" Jan 30 21:30:29 crc kubenswrapper[4834]: I0130 21:30:29.774921 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jl729" podUID="895e1507-aebd-4f35-b07e-2074bda88037" containerName="registry-server" containerID="cri-o://c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6" gracePeriod=2 Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.687989 4834 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.797783 4834 generic.go:334] "Generic (PLEG): container finished" podID="895e1507-aebd-4f35-b07e-2074bda88037" containerID="c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6" exitCode=0 Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.797846 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl729" event={"ID":"895e1507-aebd-4f35-b07e-2074bda88037","Type":"ContainerDied","Data":"c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6"} Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.798430 4834 scope.go:117] "RemoveContainer" containerID="c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.798442 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl729" event={"ID":"895e1507-aebd-4f35-b07e-2074bda88037","Type":"ContainerDied","Data":"d48777dabbe9add62528dfc1a00239583d0270f37319f9590ebcc39b4dce9f0e"} Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.798622 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jl729" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.817792 4834 scope.go:117] "RemoveContainer" containerID="b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.821171 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-utilities\") pod \"895e1507-aebd-4f35-b07e-2074bda88037\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.821220 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-catalog-content\") pod \"895e1507-aebd-4f35-b07e-2074bda88037\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.821519 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2tt2\" (UniqueName: \"kubernetes.io/projected/895e1507-aebd-4f35-b07e-2074bda88037-kube-api-access-g2tt2\") pod \"895e1507-aebd-4f35-b07e-2074bda88037\" (UID: \"895e1507-aebd-4f35-b07e-2074bda88037\") " Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.823510 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-utilities" (OuterVolumeSpecName: "utilities") pod "895e1507-aebd-4f35-b07e-2074bda88037" (UID: "895e1507-aebd-4f35-b07e-2074bda88037"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.830082 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895e1507-aebd-4f35-b07e-2074bda88037-kube-api-access-g2tt2" (OuterVolumeSpecName: "kube-api-access-g2tt2") pod "895e1507-aebd-4f35-b07e-2074bda88037" (UID: "895e1507-aebd-4f35-b07e-2074bda88037"). InnerVolumeSpecName "kube-api-access-g2tt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.843363 4834 scope.go:117] "RemoveContainer" containerID="91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.848529 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "895e1507-aebd-4f35-b07e-2074bda88037" (UID: "895e1507-aebd-4f35-b07e-2074bda88037"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.882364 4834 scope.go:117] "RemoveContainer" containerID="c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6" Jan 30 21:30:30 crc kubenswrapper[4834]: E0130 21:30:30.883075 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6\": container with ID starting with c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6 not found: ID does not exist" containerID="c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.883144 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6"} err="failed to get container status \"c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6\": rpc error: code = NotFound desc = could not find container \"c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6\": container with ID starting with c274a9ea2a8eee70a759e95845f05fc5acc812f123e09acd9704a07668ad42c6 not found: ID does not exist" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.883222 4834 scope.go:117] "RemoveContainer" containerID="b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41" Jan 30 21:30:30 crc kubenswrapper[4834]: E0130 21:30:30.883790 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41\": container with ID starting with b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41 not found: ID does not exist" containerID="b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.883837 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41"} err="failed to get container status \"b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41\": rpc error: code = NotFound desc = could not find container \"b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41\": container with ID starting with b516ce895412e990df882b86aedf72bd09272464428c1bb715c59e9afaa3ed41 not found: ID does not exist" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.883871 4834 scope.go:117] "RemoveContainer" containerID="91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe" Jan 30 21:30:30 crc kubenswrapper[4834]: E0130 21:30:30.884371 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe\": container with ID starting with 91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe not found: ID does not exist" containerID="91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.884418 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe"} err="failed to get container status \"91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe\": rpc error: code = NotFound desc = could not find container \"91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe\": container with ID starting with 91cd7dd230200ab662d582c86e50ae167595669c178d682a967c5d7b219d07fe not found: ID does not exist" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.922775 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-utilities\") on node 
\"crc\" DevicePath \"\"" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.922813 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/895e1507-aebd-4f35-b07e-2074bda88037-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:30 crc kubenswrapper[4834]: I0130 21:30:30.922828 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2tt2\" (UniqueName: \"kubernetes.io/projected/895e1507-aebd-4f35-b07e-2074bda88037-kube-api-access-g2tt2\") on node \"crc\" DevicePath \"\"" Jan 30 21:30:31 crc kubenswrapper[4834]: I0130 21:30:31.140703 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jl729"] Jan 30 21:30:31 crc kubenswrapper[4834]: I0130 21:30:31.146487 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jl729"] Jan 30 21:30:31 crc kubenswrapper[4834]: I0130 21:30:31.543475 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895e1507-aebd-4f35-b07e-2074bda88037" path="/var/lib/kubelet/pods/895e1507-aebd-4f35-b07e-2074bda88037/volumes" Jan 30 21:30:34 crc kubenswrapper[4834]: I0130 21:30:34.834563 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-zjpxd" event={"ID":"854d0b76-06df-4a28-ad4e-b50396ef3248","Type":"ContainerStarted","Data":"d8382ead468abecc81f10a169b5fc83f3bc87cac4967c8bc17d02340497c9202"} Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.253858 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-zjpxd" podStartSLOduration=26.226012743 podStartE2EDuration="32.253833871s" podCreationTimestamp="2026-01-30 21:30:27 +0000 UTC" firstStartedPulling="2026-01-30 21:30:28.601490802 +0000 UTC m=+879.754636940" lastFinishedPulling="2026-01-30 21:30:34.6293119 +0000 UTC m=+885.782458068" observedRunningTime="2026-01-30 21:30:34.858839264 +0000 UTC 
m=+886.011985422" watchObservedRunningTime="2026-01-30 21:30:59.253833871 +0000 UTC m=+910.406980009" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.255317 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg"] Jan 30 21:30:59 crc kubenswrapper[4834]: E0130 21:30:59.255631 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895e1507-aebd-4f35-b07e-2074bda88037" containerName="extract-utilities" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.255650 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="895e1507-aebd-4f35-b07e-2074bda88037" containerName="extract-utilities" Jan 30 21:30:59 crc kubenswrapper[4834]: E0130 21:30:59.255661 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895e1507-aebd-4f35-b07e-2074bda88037" containerName="registry-server" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.255667 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="895e1507-aebd-4f35-b07e-2074bda88037" containerName="registry-server" Jan 30 21:30:59 crc kubenswrapper[4834]: E0130 21:30:59.255697 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895e1507-aebd-4f35-b07e-2074bda88037" containerName="extract-content" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.255703 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="895e1507-aebd-4f35-b07e-2074bda88037" containerName="extract-content" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.255828 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="895e1507-aebd-4f35-b07e-2074bda88037" containerName="registry-server" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.258144 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.262285 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.270606 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg"] Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.324609 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.324659 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.324727 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gznsn\" (UniqueName: \"kubernetes.io/projected/390635a4-5536-4897-b656-587cf2dbf6dc-kube-api-access-gznsn\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: 
I0130 21:30:59.426207 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gznsn\" (UniqueName: \"kubernetes.io/projected/390635a4-5536-4897-b656-587cf2dbf6dc-kube-api-access-gznsn\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.426351 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.426379 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.426954 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.426954 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.454174 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gznsn\" (UniqueName: \"kubernetes.io/projected/390635a4-5536-4897-b656-587cf2dbf6dc-kube-api-access-gznsn\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:30:59 crc kubenswrapper[4834]: I0130 21:30:59.587875 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:31:00 crc kubenswrapper[4834]: I0130 21:31:00.024148 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg"] Jan 30 21:31:01 crc kubenswrapper[4834]: I0130 21:31:01.049152 4834 generic.go:334] "Generic (PLEG): container finished" podID="390635a4-5536-4897-b656-587cf2dbf6dc" containerID="a253aa9f73f1c9f1e6a7e9dee2292fa308e80d7c40793de514b4d423908fc020" exitCode=0 Jan 30 21:31:01 crc kubenswrapper[4834]: I0130 21:31:01.049219 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" event={"ID":"390635a4-5536-4897-b656-587cf2dbf6dc","Type":"ContainerDied","Data":"a253aa9f73f1c9f1e6a7e9dee2292fa308e80d7c40793de514b4d423908fc020"} Jan 30 21:31:01 crc kubenswrapper[4834]: I0130 21:31:01.049535 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" event={"ID":"390635a4-5536-4897-b656-587cf2dbf6dc","Type":"ContainerStarted","Data":"909f027852b4e48db848653dae8d26fe7d489fcaadf8bdaaa2bc9e82b827fda9"} Jan 30 21:31:03 crc kubenswrapper[4834]: I0130 21:31:03.062296 4834 generic.go:334] "Generic (PLEG): container finished" podID="390635a4-5536-4897-b656-587cf2dbf6dc" containerID="4b662422ffa0311c93d2a77c59de4e6415438252ba821f36596a9f258feac286" exitCode=0 Jan 30 21:31:03 crc kubenswrapper[4834]: I0130 21:31:03.062368 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" event={"ID":"390635a4-5536-4897-b656-587cf2dbf6dc","Type":"ContainerDied","Data":"4b662422ffa0311c93d2a77c59de4e6415438252ba821f36596a9f258feac286"} Jan 30 21:31:04 crc kubenswrapper[4834]: I0130 21:31:04.073216 4834 generic.go:334] "Generic (PLEG): container finished" podID="390635a4-5536-4897-b656-587cf2dbf6dc" containerID="b4d0694340880b137ea90cb4328d691cc014452722c4062306efb4c4918330a1" exitCode=0 Jan 30 21:31:04 crc kubenswrapper[4834]: I0130 21:31:04.073288 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" event={"ID":"390635a4-5536-4897-b656-587cf2dbf6dc","Type":"ContainerDied","Data":"b4d0694340880b137ea90cb4328d691cc014452722c4062306efb4c4918330a1"} Jan 30 21:31:04 crc kubenswrapper[4834]: I0130 21:31:04.160811 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:31:04 crc kubenswrapper[4834]: I0130 21:31:04.160903 4834 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.385751 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.513675 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-util\") pod \"390635a4-5536-4897-b656-587cf2dbf6dc\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.513960 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-bundle\") pod \"390635a4-5536-4897-b656-587cf2dbf6dc\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.514070 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gznsn\" (UniqueName: \"kubernetes.io/projected/390635a4-5536-4897-b656-587cf2dbf6dc-kube-api-access-gznsn\") pod \"390635a4-5536-4897-b656-587cf2dbf6dc\" (UID: \"390635a4-5536-4897-b656-587cf2dbf6dc\") " Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.514663 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-bundle" (OuterVolumeSpecName: "bundle") pod "390635a4-5536-4897-b656-587cf2dbf6dc" (UID: "390635a4-5536-4897-b656-587cf2dbf6dc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.520317 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390635a4-5536-4897-b656-587cf2dbf6dc-kube-api-access-gznsn" (OuterVolumeSpecName: "kube-api-access-gznsn") pod "390635a4-5536-4897-b656-587cf2dbf6dc" (UID: "390635a4-5536-4897-b656-587cf2dbf6dc"). InnerVolumeSpecName "kube-api-access-gznsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.615649 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.615682 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gznsn\" (UniqueName: \"kubernetes.io/projected/390635a4-5536-4897-b656-587cf2dbf6dc-kube-api-access-gznsn\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.890062 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-util" (OuterVolumeSpecName: "util") pod "390635a4-5536-4897-b656-587cf2dbf6dc" (UID: "390635a4-5536-4897-b656-587cf2dbf6dc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:31:05 crc kubenswrapper[4834]: I0130 21:31:05.919306 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/390635a4-5536-4897-b656-587cf2dbf6dc-util\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:06 crc kubenswrapper[4834]: I0130 21:31:06.091745 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" event={"ID":"390635a4-5536-4897-b656-587cf2dbf6dc","Type":"ContainerDied","Data":"909f027852b4e48db848653dae8d26fe7d489fcaadf8bdaaa2bc9e82b827fda9"} Jan 30 21:31:06 crc kubenswrapper[4834]: I0130 21:31:06.091816 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg" Jan 30 21:31:06 crc kubenswrapper[4834]: I0130 21:31:06.091823 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="909f027852b4e48db848653dae8d26fe7d489fcaadf8bdaaa2bc9e82b827fda9" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.820506 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-94zx2"] Jan 30 21:31:07 crc kubenswrapper[4834]: E0130 21:31:07.821108 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390635a4-5536-4897-b656-587cf2dbf6dc" containerName="extract" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.821124 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="390635a4-5536-4897-b656-587cf2dbf6dc" containerName="extract" Jan 30 21:31:07 crc kubenswrapper[4834]: E0130 21:31:07.821133 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390635a4-5536-4897-b656-587cf2dbf6dc" containerName="pull" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.821141 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="390635a4-5536-4897-b656-587cf2dbf6dc" containerName="pull" Jan 30 21:31:07 crc kubenswrapper[4834]: E0130 21:31:07.821155 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390635a4-5536-4897-b656-587cf2dbf6dc" containerName="util" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.821162 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="390635a4-5536-4897-b656-587cf2dbf6dc" containerName="util" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.821289 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="390635a4-5536-4897-b656-587cf2dbf6dc" containerName="extract" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.821830 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-94zx2" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.826538 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-fk6pb" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.826608 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.826651 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.845042 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-94zx2"] Jan 30 21:31:07 crc kubenswrapper[4834]: I0130 21:31:07.943488 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vkjl\" (UniqueName: \"kubernetes.io/projected/acad5879-77cc-49f6-89c4-ccc5e97b9c4e-kube-api-access-6vkjl\") pod \"nmstate-operator-646758c888-94zx2\" (UID: \"acad5879-77cc-49f6-89c4-ccc5e97b9c4e\") " pod="openshift-nmstate/nmstate-operator-646758c888-94zx2" Jan 30 
21:31:08 crc kubenswrapper[4834]: I0130 21:31:08.045139 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkjl\" (UniqueName: \"kubernetes.io/projected/acad5879-77cc-49f6-89c4-ccc5e97b9c4e-kube-api-access-6vkjl\") pod \"nmstate-operator-646758c888-94zx2\" (UID: \"acad5879-77cc-49f6-89c4-ccc5e97b9c4e\") " pod="openshift-nmstate/nmstate-operator-646758c888-94zx2" Jan 30 21:31:08 crc kubenswrapper[4834]: I0130 21:31:08.063277 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vkjl\" (UniqueName: \"kubernetes.io/projected/acad5879-77cc-49f6-89c4-ccc5e97b9c4e-kube-api-access-6vkjl\") pod \"nmstate-operator-646758c888-94zx2\" (UID: \"acad5879-77cc-49f6-89c4-ccc5e97b9c4e\") " pod="openshift-nmstate/nmstate-operator-646758c888-94zx2" Jan 30 21:31:08 crc kubenswrapper[4834]: I0130 21:31:08.143120 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-94zx2" Jan 30 21:31:08 crc kubenswrapper[4834]: I0130 21:31:08.670883 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-94zx2"] Jan 30 21:31:09 crc kubenswrapper[4834]: I0130 21:31:09.112713 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-94zx2" event={"ID":"acad5879-77cc-49f6-89c4-ccc5e97b9c4e","Type":"ContainerStarted","Data":"40261b3281746220f66b98eac32b11c647dd40373a3e7e38be243a7ddec1dd24"} Jan 30 21:31:11 crc kubenswrapper[4834]: I0130 21:31:11.128819 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-94zx2" event={"ID":"acad5879-77cc-49f6-89c4-ccc5e97b9c4e","Type":"ContainerStarted","Data":"b39686f6d6dc924cde987f26abfaef89de560d6411e91c0517981607de0d9018"} Jan 30 21:31:11 crc kubenswrapper[4834]: I0130 21:31:11.149656 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-646758c888-94zx2" podStartSLOduration=2.242201041 podStartE2EDuration="4.14963262s" podCreationTimestamp="2026-01-30 21:31:07 +0000 UTC" firstStartedPulling="2026-01-30 21:31:08.685299948 +0000 UTC m=+919.838446086" lastFinishedPulling="2026-01-30 21:31:10.592731507 +0000 UTC m=+921.745877665" observedRunningTime="2026-01-30 21:31:11.144265298 +0000 UTC m=+922.297411436" watchObservedRunningTime="2026-01-30 21:31:11.14963262 +0000 UTC m=+922.302778758" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.155257 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-tx5rx"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.156436 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.158832 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tkllg" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.171834 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-tx5rx"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.187824 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.189201 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-627sx"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.189940 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.190185 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.206029 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.223728 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.296321 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.297096 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.305824 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mwc58" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.305862 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.305909 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-ovs-socket\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.306032 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.306306 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd5zs\" (UniqueName: 
\"kubernetes.io/projected/acb1f436-a969-4b96-a54a-0228575c680b-kube-api-access-hd5zs\") pod \"nmstate-metrics-54757c584b-tx5rx\" (UID: \"acb1f436-a969-4b96-a54a-0228575c680b\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.306341 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-nmstate-lock\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.306376 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk5q5\" (UniqueName: \"kubernetes.io/projected/cff9b6ad-5a56-4911-876a-51c7a25619c4-kube-api-access-sk5q5\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.306432 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-dbus-socket\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.306449 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztrmt\" (UniqueName: \"kubernetes.io/projected/e7699088-70c6-4994-ad05-6ae59420798c-kube-api-access-ztrmt\") pod \"nmstate-webhook-8474b5b9d8-frjh4\" (UID: \"e7699088-70c6-4994-ad05-6ae59420798c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.306463 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e7699088-70c6-4994-ad05-6ae59420798c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-frjh4\" (UID: \"e7699088-70c6-4994-ad05-6ae59420798c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.323327 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.407858 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd5zs\" (UniqueName: \"kubernetes.io/projected/acb1f436-a969-4b96-a54a-0228575c680b-kube-api-access-hd5zs\") pod \"nmstate-metrics-54757c584b-tx5rx\" (UID: \"acb1f436-a969-4b96-a54a-0228575c680b\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.407912 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-nmstate-lock\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408042 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk5q5\" (UniqueName: \"kubernetes.io/projected/cff9b6ad-5a56-4911-876a-51c7a25619c4-kube-api-access-sk5q5\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408110 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-nmstate-lock\") pod \"nmstate-handler-627sx\" (UID: 
\"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-dbus-socket\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408177 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrmt\" (UniqueName: \"kubernetes.io/projected/e7699088-70c6-4994-ad05-6ae59420798c-kube-api-access-ztrmt\") pod \"nmstate-webhook-8474b5b9d8-frjh4\" (UID: \"e7699088-70c6-4994-ad05-6ae59420798c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408202 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e7699088-70c6-4994-ad05-6ae59420798c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-frjh4\" (UID: \"e7699088-70c6-4994-ad05-6ae59420798c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408236 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408289 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95vfr\" (UniqueName: 
\"kubernetes.io/projected/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-kube-api-access-95vfr\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408318 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-ovs-socket\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408345 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-dbus-socket\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408362 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.408454 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/cff9b6ad-5a56-4911-876a-51c7a25619c4-ovs-socket\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.414626 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/e7699088-70c6-4994-ad05-6ae59420798c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-frjh4\" (UID: \"e7699088-70c6-4994-ad05-6ae59420798c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.437433 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztrmt\" (UniqueName: \"kubernetes.io/projected/e7699088-70c6-4994-ad05-6ae59420798c-kube-api-access-ztrmt\") pod \"nmstate-webhook-8474b5b9d8-frjh4\" (UID: \"e7699088-70c6-4994-ad05-6ae59420798c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.439664 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd5zs\" (UniqueName: \"kubernetes.io/projected/acb1f436-a969-4b96-a54a-0228575c680b-kube-api-access-hd5zs\") pod \"nmstate-metrics-54757c584b-tx5rx\" (UID: \"acb1f436-a969-4b96-a54a-0228575c680b\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.441422 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk5q5\" (UniqueName: \"kubernetes.io/projected/cff9b6ad-5a56-4911-876a-51c7a25619c4-kube-api-access-sk5q5\") pod \"nmstate-handler-627sx\" (UID: \"cff9b6ad-5a56-4911-876a-51c7a25619c4\") " pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.470166 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.495167 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-659f5896b9-h2b8g"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.496580 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.507884 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-659f5896b9-h2b8g"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510146 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-oauth-serving-cert\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510277 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-oauth-config\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510334 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510371 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-config\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510418 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-trusted-ca-bundle\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510468 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95vfr\" (UniqueName: \"kubernetes.io/projected/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-kube-api-access-95vfr\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510526 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-serving-cert\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510555 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510573 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blsz7\" (UniqueName: \"kubernetes.io/projected/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-kube-api-access-blsz7\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " 
pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.510618 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-service-ca\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.511705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.514020 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.523048 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.534754 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.540019 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95vfr\" (UniqueName: \"kubernetes.io/projected/b1cebcb5-34d7-4e5d-b5bb-569ed874a27c-kube-api-access-95vfr\") pod \"nmstate-console-plugin-7754f76f8b-4kdm4\" (UID: \"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: W0130 21:31:12.562674 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcff9b6ad_5a56_4911_876a_51c7a25619c4.slice/crio-308f4b92427bd2c2bb678a1588d1e5828301f4dfc36d23798e15eef083b1a93e WatchSource:0}: Error finding container 308f4b92427bd2c2bb678a1588d1e5828301f4dfc36d23798e15eef083b1a93e: Status 404 returned error can't find the container with id 308f4b92427bd2c2bb678a1588d1e5828301f4dfc36d23798e15eef083b1a93e Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.610932 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-serving-cert\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.610995 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blsz7\" (UniqueName: \"kubernetes.io/projected/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-kube-api-access-blsz7\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.611029 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-service-ca\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.611056 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-oauth-serving-cert\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.611100 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-oauth-config\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.611125 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-config\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.611142 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-trusted-ca-bundle\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.612015 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-service-ca\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.612317 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-trusted-ca-bundle\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.612615 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-config\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.612790 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-oauth-serving-cert\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.615959 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-serving-cert\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.616309 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-console-oauth-config\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.620321 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.629740 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blsz7\" (UniqueName: \"kubernetes.io/projected/2d0e97a2-e239-441f-9c40-b0346b9b2ab8-kube-api-access-blsz7\") pod \"console-659f5896b9-h2b8g\" (UID: \"2d0e97a2-e239-441f-9c40-b0346b9b2ab8\") " pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.721082 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-tx5rx"] Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.857726 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:12 crc kubenswrapper[4834]: I0130 21:31:12.966299 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4"] Jan 30 21:31:13 crc kubenswrapper[4834]: I0130 21:31:13.054290 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4"] Jan 30 21:31:13 crc kubenswrapper[4834]: I0130 21:31:13.139709 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" event={"ID":"acb1f436-a969-4b96-a54a-0228575c680b","Type":"ContainerStarted","Data":"536b45cd1d18d284f72afdeeaeae3705d2e88cbde3fd3719606354094cfa9a44"} Jan 30 21:31:13 crc kubenswrapper[4834]: I0130 21:31:13.141829 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-627sx" event={"ID":"cff9b6ad-5a56-4911-876a-51c7a25619c4","Type":"ContainerStarted","Data":"308f4b92427bd2c2bb678a1588d1e5828301f4dfc36d23798e15eef083b1a93e"} Jan 30 21:31:13 crc kubenswrapper[4834]: I0130 21:31:13.143888 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" event={"ID":"e7699088-70c6-4994-ad05-6ae59420798c","Type":"ContainerStarted","Data":"5a06f2afdf0e447ef989b53a4e280e3e628640b8eb2c37f94e331011cd5da10c"} Jan 30 21:31:13 crc kubenswrapper[4834]: I0130 21:31:13.145381 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" event={"ID":"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c","Type":"ContainerStarted","Data":"41fab3c8e8afb46dcd425228c6e4ba6a0e79f0e5995861a872c4b3ae5d85e904"} Jan 30 21:31:13 crc kubenswrapper[4834]: I0130 21:31:13.281810 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-659f5896b9-h2b8g"] Jan 30 21:31:14 crc kubenswrapper[4834]: I0130 21:31:14.153842 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-659f5896b9-h2b8g" event={"ID":"2d0e97a2-e239-441f-9c40-b0346b9b2ab8","Type":"ContainerStarted","Data":"aee804d90e454d44cd26f2610254fb7da7610d4052777885669975bd3ebf92a6"} Jan 30 21:31:14 crc kubenswrapper[4834]: I0130 21:31:14.154088 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-659f5896b9-h2b8g" event={"ID":"2d0e97a2-e239-441f-9c40-b0346b9b2ab8","Type":"ContainerStarted","Data":"5d07c1cc3d0cd3bf6edf4499b2eb5c86a0a8b9c8bb6e34cc5d78b39bc0c37096"} Jan 30 21:31:14 crc kubenswrapper[4834]: I0130 21:31:14.177568 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-659f5896b9-h2b8g" podStartSLOduration=2.17754883 podStartE2EDuration="2.17754883s" podCreationTimestamp="2026-01-30 21:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:31:14.171526511 +0000 UTC m=+925.324672649" watchObservedRunningTime="2026-01-30 21:31:14.17754883 +0000 UTC m=+925.330694968" Jan 30 21:31:17 crc kubenswrapper[4834]: I0130 21:31:17.173530 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" event={"ID":"e7699088-70c6-4994-ad05-6ae59420798c","Type":"ContainerStarted","Data":"e34e1361abc99f219be8c1c48aadf2725a07c7be0e85759c186ecf8f4f11860f"} Jan 30 21:31:17 crc kubenswrapper[4834]: I0130 21:31:17.173960 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:17 crc kubenswrapper[4834]: I0130 21:31:17.174989 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" event={"ID":"b1cebcb5-34d7-4e5d-b5bb-569ed874a27c","Type":"ContainerStarted","Data":"f5182912d81c7957f4cc5f215cc123db5cdd26eb08bc77ef5f904181c70624fc"} Jan 30 21:31:17 crc 
kubenswrapper[4834]: I0130 21:31:17.176210 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" event={"ID":"acb1f436-a969-4b96-a54a-0228575c680b","Type":"ContainerStarted","Data":"cd38758ac5d8cad51c8a13ec9a4564e956edf6c8cb9aa189684f2cd7d7001371"} Jan 30 21:31:17 crc kubenswrapper[4834]: I0130 21:31:17.178696 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-627sx" event={"ID":"cff9b6ad-5a56-4911-876a-51c7a25619c4","Type":"ContainerStarted","Data":"c44a7b75a6d4d49e2e5c66770dd44793918a61587003811cd81b3490aff447ab"} Jan 30 21:31:17 crc kubenswrapper[4834]: I0130 21:31:17.178854 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:17 crc kubenswrapper[4834]: I0130 21:31:17.207655 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" podStartSLOduration=2.224977167 podStartE2EDuration="5.207636212s" podCreationTimestamp="2026-01-30 21:31:12 +0000 UTC" firstStartedPulling="2026-01-30 21:31:12.976077665 +0000 UTC m=+924.129223803" lastFinishedPulling="2026-01-30 21:31:15.95873671 +0000 UTC m=+927.111882848" observedRunningTime="2026-01-30 21:31:17.194314757 +0000 UTC m=+928.347460895" watchObservedRunningTime="2026-01-30 21:31:17.207636212 +0000 UTC m=+928.360782350" Jan 30 21:31:17 crc kubenswrapper[4834]: I0130 21:31:17.221148 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4kdm4" podStartSLOduration=2.334031762 podStartE2EDuration="5.221127633s" podCreationTimestamp="2026-01-30 21:31:12 +0000 UTC" firstStartedPulling="2026-01-30 21:31:13.071466314 +0000 UTC m=+924.224612442" lastFinishedPulling="2026-01-30 21:31:15.958562175 +0000 UTC m=+927.111708313" observedRunningTime="2026-01-30 21:31:17.219758804 +0000 UTC m=+928.372904942" 
watchObservedRunningTime="2026-01-30 21:31:17.221127633 +0000 UTC m=+928.374273781" Jan 30 21:31:17 crc kubenswrapper[4834]: I0130 21:31:17.225248 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-627sx" podStartSLOduration=1.825480524 podStartE2EDuration="5.225227278s" podCreationTimestamp="2026-01-30 21:31:12 +0000 UTC" firstStartedPulling="2026-01-30 21:31:12.566073515 +0000 UTC m=+923.719219653" lastFinishedPulling="2026-01-30 21:31:15.965820269 +0000 UTC m=+927.118966407" observedRunningTime="2026-01-30 21:31:17.210364609 +0000 UTC m=+928.363510747" watchObservedRunningTime="2026-01-30 21:31:17.225227278 +0000 UTC m=+928.378373416" Jan 30 21:31:19 crc kubenswrapper[4834]: I0130 21:31:19.202633 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" event={"ID":"acb1f436-a969-4b96-a54a-0228575c680b","Type":"ContainerStarted","Data":"9490b252f8c6738087983c29f3ed20fb072da5f20e970f9166fe57c174467d34"} Jan 30 21:31:19 crc kubenswrapper[4834]: I0130 21:31:19.241743 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-tx5rx" podStartSLOduration=1.562745376 podStartE2EDuration="7.241714343s" podCreationTimestamp="2026-01-30 21:31:12 +0000 UTC" firstStartedPulling="2026-01-30 21:31:12.731120298 +0000 UTC m=+923.884266436" lastFinishedPulling="2026-01-30 21:31:18.410089265 +0000 UTC m=+929.563235403" observedRunningTime="2026-01-30 21:31:19.235000763 +0000 UTC m=+930.388146901" watchObservedRunningTime="2026-01-30 21:31:19.241714343 +0000 UTC m=+930.394860521" Jan 30 21:31:22 crc kubenswrapper[4834]: I0130 21:31:22.563953 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-627sx" Jan 30 21:31:22 crc kubenswrapper[4834]: I0130 21:31:22.858352 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:22 crc kubenswrapper[4834]: I0130 21:31:22.858433 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:22 crc kubenswrapper[4834]: I0130 21:31:22.864440 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:23 crc kubenswrapper[4834]: I0130 21:31:23.245436 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-659f5896b9-h2b8g" Jan 30 21:31:23 crc kubenswrapper[4834]: I0130 21:31:23.334145 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4k6d4"] Jan 30 21:31:32 crc kubenswrapper[4834]: I0130 21:31:32.532579 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-frjh4" Jan 30 21:31:34 crc kubenswrapper[4834]: I0130 21:31:34.164558 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:31:34 crc kubenswrapper[4834]: I0130 21:31:34.165060 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.058837 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc"] Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.060912 4834 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.064423 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.070665 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc"] Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.088248 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.088313 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4952p\" (UniqueName: \"kubernetes.io/projected/027a0611-c347-4644-ba70-7f12f2f9a344-kube-api-access-4952p\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.088337 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.190065 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.190432 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4952p\" (UniqueName: \"kubernetes.io/projected/027a0611-c347-4644-ba70-7f12f2f9a344-kube-api-access-4952p\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.190589 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.190754 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.191285 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.215052 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4952p\" (UniqueName: \"kubernetes.io/projected/027a0611-c347-4644-ba70-7f12f2f9a344-kube-api-access-4952p\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.379580 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.400128 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4k6d4" podUID="59b3b974-9ba8-426b-8836-34ecbb56f86f" containerName="console" containerID="cri-o://9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2" gracePeriod=15 Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.610492 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc"] Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.889139 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4k6d4_59b3b974-9ba8-426b-8836-34ecbb56f86f/console/0.log" Jan 30 21:31:48 crc kubenswrapper[4834]: I0130 21:31:48.889217 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.001299 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-oauth-config\") pod \"59b3b974-9ba8-426b-8836-34ecbb56f86f\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.001660 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-trusted-ca-bundle\") pod \"59b3b974-9ba8-426b-8836-34ecbb56f86f\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.001701 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-oauth-serving-cert\") pod \"59b3b974-9ba8-426b-8836-34ecbb56f86f\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.001730 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58dh2\" (UniqueName: \"kubernetes.io/projected/59b3b974-9ba8-426b-8836-34ecbb56f86f-kube-api-access-58dh2\") pod \"59b3b974-9ba8-426b-8836-34ecbb56f86f\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.001795 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-serving-cert\") pod \"59b3b974-9ba8-426b-8836-34ecbb56f86f\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.001820 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-service-ca\") pod \"59b3b974-9ba8-426b-8836-34ecbb56f86f\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.001837 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-config\") pod \"59b3b974-9ba8-426b-8836-34ecbb56f86f\" (UID: \"59b3b974-9ba8-426b-8836-34ecbb56f86f\") " Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.002341 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "59b3b974-9ba8-426b-8836-34ecbb56f86f" (UID: "59b3b974-9ba8-426b-8836-34ecbb56f86f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.002373 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-service-ca" (OuterVolumeSpecName: "service-ca") pod "59b3b974-9ba8-426b-8836-34ecbb56f86f" (UID: "59b3b974-9ba8-426b-8836-34ecbb56f86f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.002439 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "59b3b974-9ba8-426b-8836-34ecbb56f86f" (UID: "59b3b974-9ba8-426b-8836-34ecbb56f86f"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.002446 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-config" (OuterVolumeSpecName: "console-config") pod "59b3b974-9ba8-426b-8836-34ecbb56f86f" (UID: "59b3b974-9ba8-426b-8836-34ecbb56f86f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.007376 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "59b3b974-9ba8-426b-8836-34ecbb56f86f" (UID: "59b3b974-9ba8-426b-8836-34ecbb56f86f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.007381 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b3b974-9ba8-426b-8836-34ecbb56f86f-kube-api-access-58dh2" (OuterVolumeSpecName: "kube-api-access-58dh2") pod "59b3b974-9ba8-426b-8836-34ecbb56f86f" (UID: "59b3b974-9ba8-426b-8836-34ecbb56f86f"). InnerVolumeSpecName "kube-api-access-58dh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.007547 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "59b3b974-9ba8-426b-8836-34ecbb56f86f" (UID: "59b3b974-9ba8-426b-8836-34ecbb56f86f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.103496 4834 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.103553 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58dh2\" (UniqueName: \"kubernetes.io/projected/59b3b974-9ba8-426b-8836-34ecbb56f86f-kube-api-access-58dh2\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.103576 4834 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.103594 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-service-ca\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.103613 4834 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.103629 4834 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59b3b974-9ba8-426b-8836-34ecbb56f86f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.103646 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59b3b974-9ba8-426b-8836-34ecbb56f86f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:49 crc 
kubenswrapper[4834]: I0130 21:31:49.453129 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4k6d4_59b3b974-9ba8-426b-8836-34ecbb56f86f/console/0.log" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.453189 4834 generic.go:334] "Generic (PLEG): container finished" podID="59b3b974-9ba8-426b-8836-34ecbb56f86f" containerID="9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2" exitCode=2 Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.453288 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4k6d4" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.453293 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4k6d4" event={"ID":"59b3b974-9ba8-426b-8836-34ecbb56f86f","Type":"ContainerDied","Data":"9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2"} Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.453427 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4k6d4" event={"ID":"59b3b974-9ba8-426b-8836-34ecbb56f86f","Type":"ContainerDied","Data":"8250b2fab1f8cf7e7e7e4a048289f9e49c58030af2172fb4bea37699126ce415"} Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.453458 4834 scope.go:117] "RemoveContainer" containerID="9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.455868 4834 generic.go:334] "Generic (PLEG): container finished" podID="027a0611-c347-4644-ba70-7f12f2f9a344" containerID="24f41c4a81d91a1c53b8404bc8dcd25730c946ff97f8fc9f4f590aa3020e116f" exitCode=0 Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.455957 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" 
event={"ID":"027a0611-c347-4644-ba70-7f12f2f9a344","Type":"ContainerDied","Data":"24f41c4a81d91a1c53b8404bc8dcd25730c946ff97f8fc9f4f590aa3020e116f"} Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.455994 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" event={"ID":"027a0611-c347-4644-ba70-7f12f2f9a344","Type":"ContainerStarted","Data":"c54eae9404cff1cc945d27c8f517dbe0e5d7125056175c34a5ae6d683905f7a6"} Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.479732 4834 scope.go:117] "RemoveContainer" containerID="9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2" Jan 30 21:31:49 crc kubenswrapper[4834]: E0130 21:31:49.480324 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2\": container with ID starting with 9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2 not found: ID does not exist" containerID="9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.480368 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2"} err="failed to get container status \"9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2\": rpc error: code = NotFound desc = could not find container \"9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2\": container with ID starting with 9c4bb52c39b656839f219aec42c8b7daef25ce18734c33fd89df2276bcc288a2 not found: ID does not exist" Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.499150 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4k6d4"] Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.502997 4834 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4k6d4"] Jan 30 21:31:49 crc kubenswrapper[4834]: I0130 21:31:49.539951 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b3b974-9ba8-426b-8836-34ecbb56f86f" path="/var/lib/kubelet/pods/59b3b974-9ba8-426b-8836-34ecbb56f86f/volumes" Jan 30 21:31:51 crc kubenswrapper[4834]: I0130 21:31:51.479065 4834 generic.go:334] "Generic (PLEG): container finished" podID="027a0611-c347-4644-ba70-7f12f2f9a344" containerID="ecc60bcf9e2f61045ab7ef3152e84d024dfbf030fe95b59f80a042d96da25603" exitCode=0 Jan 30 21:31:51 crc kubenswrapper[4834]: I0130 21:31:51.479203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" event={"ID":"027a0611-c347-4644-ba70-7f12f2f9a344","Type":"ContainerDied","Data":"ecc60bcf9e2f61045ab7ef3152e84d024dfbf030fe95b59f80a042d96da25603"} Jan 30 21:31:52 crc kubenswrapper[4834]: I0130 21:31:52.490256 4834 generic.go:334] "Generic (PLEG): container finished" podID="027a0611-c347-4644-ba70-7f12f2f9a344" containerID="5ec059cc6c0644bb3c3446b745d326897791574147fbadc632c3cd2a432d4639" exitCode=0 Jan 30 21:31:52 crc kubenswrapper[4834]: I0130 21:31:52.490376 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" event={"ID":"027a0611-c347-4644-ba70-7f12f2f9a344","Type":"ContainerDied","Data":"5ec059cc6c0644bb3c3446b745d326897791574147fbadc632c3cd2a432d4639"} Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.777761 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.877214 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-bundle\") pod \"027a0611-c347-4644-ba70-7f12f2f9a344\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.877261 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4952p\" (UniqueName: \"kubernetes.io/projected/027a0611-c347-4644-ba70-7f12f2f9a344-kube-api-access-4952p\") pod \"027a0611-c347-4644-ba70-7f12f2f9a344\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.877351 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-util\") pod \"027a0611-c347-4644-ba70-7f12f2f9a344\" (UID: \"027a0611-c347-4644-ba70-7f12f2f9a344\") " Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.878196 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-bundle" (OuterVolumeSpecName: "bundle") pod "027a0611-c347-4644-ba70-7f12f2f9a344" (UID: "027a0611-c347-4644-ba70-7f12f2f9a344"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.882691 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027a0611-c347-4644-ba70-7f12f2f9a344-kube-api-access-4952p" (OuterVolumeSpecName: "kube-api-access-4952p") pod "027a0611-c347-4644-ba70-7f12f2f9a344" (UID: "027a0611-c347-4644-ba70-7f12f2f9a344"). InnerVolumeSpecName "kube-api-access-4952p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.889910 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-util" (OuterVolumeSpecName: "util") pod "027a0611-c347-4644-ba70-7f12f2f9a344" (UID: "027a0611-c347-4644-ba70-7f12f2f9a344"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.978502 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-util\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.978539 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/027a0611-c347-4644-ba70-7f12f2f9a344-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:53 crc kubenswrapper[4834]: I0130 21:31:53.978550 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4952p\" (UniqueName: \"kubernetes.io/projected/027a0611-c347-4644-ba70-7f12f2f9a344-kube-api-access-4952p\") on node \"crc\" DevicePath \"\"" Jan 30 21:31:54 crc kubenswrapper[4834]: I0130 21:31:54.522278 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" event={"ID":"027a0611-c347-4644-ba70-7f12f2f9a344","Type":"ContainerDied","Data":"c54eae9404cff1cc945d27c8f517dbe0e5d7125056175c34a5ae6d683905f7a6"} Jan 30 21:31:54 crc kubenswrapper[4834]: I0130 21:31:54.522327 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c54eae9404cff1cc945d27c8f517dbe0e5d7125056175c34a5ae6d683905f7a6" Jan 30 21:31:54 crc kubenswrapper[4834]: I0130 21:31:54.522415 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.366379 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4"] Jan 30 21:32:03 crc kubenswrapper[4834]: E0130 21:32:03.367033 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b3b974-9ba8-426b-8836-34ecbb56f86f" containerName="console" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.367045 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b3b974-9ba8-426b-8836-34ecbb56f86f" containerName="console" Jan 30 21:32:03 crc kubenswrapper[4834]: E0130 21:32:03.367054 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027a0611-c347-4644-ba70-7f12f2f9a344" containerName="util" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.367060 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="027a0611-c347-4644-ba70-7f12f2f9a344" containerName="util" Jan 30 21:32:03 crc kubenswrapper[4834]: E0130 21:32:03.367070 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027a0611-c347-4644-ba70-7f12f2f9a344" containerName="pull" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.367087 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="027a0611-c347-4644-ba70-7f12f2f9a344" containerName="pull" Jan 30 21:32:03 crc kubenswrapper[4834]: E0130 21:32:03.367098 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027a0611-c347-4644-ba70-7f12f2f9a344" containerName="extract" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.367104 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="027a0611-c347-4644-ba70-7f12f2f9a344" containerName="extract" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.367210 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b3b974-9ba8-426b-8836-34ecbb56f86f" 
containerName="console" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.367223 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="027a0611-c347-4644-ba70-7f12f2f9a344" containerName="extract" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.367703 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.369841 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.370088 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.370205 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.370310 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xbgxs" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.377242 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.381644 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4"] Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.440757 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmbqm\" (UniqueName: \"kubernetes.io/projected/db5dfdf6-696e-40f9-95a6-baa1b909a02e-kube-api-access-vmbqm\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " 
pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.440820 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db5dfdf6-696e-40f9-95a6-baa1b909a02e-apiservice-cert\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.440840 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db5dfdf6-696e-40f9-95a6-baa1b909a02e-webhook-cert\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.542238 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db5dfdf6-696e-40f9-95a6-baa1b909a02e-apiservice-cert\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.542288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db5dfdf6-696e-40f9-95a6-baa1b909a02e-webhook-cert\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.543278 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-vmbqm\" (UniqueName: \"kubernetes.io/projected/db5dfdf6-696e-40f9-95a6-baa1b909a02e-kube-api-access-vmbqm\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.548480 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db5dfdf6-696e-40f9-95a6-baa1b909a02e-apiservice-cert\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.563641 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db5dfdf6-696e-40f9-95a6-baa1b909a02e-webhook-cert\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.570076 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmbqm\" (UniqueName: \"kubernetes.io/projected/db5dfdf6-696e-40f9-95a6-baa1b909a02e-kube-api-access-vmbqm\") pod \"metallb-operator-controller-manager-6dfb4f7bb8-zhhz4\" (UID: \"db5dfdf6-696e-40f9-95a6-baa1b909a02e\") " pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.681042 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q"] Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.681805 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.683613 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.683622 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.684658 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-x42sk" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.690289 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.700873 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q"] Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.850386 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cce27a06-72ce-4b87-be28-e71501ec9291-apiservice-cert\") pod \"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.850479 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cce27a06-72ce-4b87-be28-e71501ec9291-webhook-cert\") pod \"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.850504 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs6gl\" (UniqueName: \"kubernetes.io/projected/cce27a06-72ce-4b87-be28-e71501ec9291-kube-api-access-bs6gl\") pod \"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.952375 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cce27a06-72ce-4b87-be28-e71501ec9291-webhook-cert\") pod \"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.952451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs6gl\" (UniqueName: \"kubernetes.io/projected/cce27a06-72ce-4b87-be28-e71501ec9291-kube-api-access-bs6gl\") pod \"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.952574 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cce27a06-72ce-4b87-be28-e71501ec9291-apiservice-cert\") pod \"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.961340 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cce27a06-72ce-4b87-be28-e71501ec9291-apiservice-cert\") pod 
\"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.966045 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cce27a06-72ce-4b87-be28-e71501ec9291-webhook-cert\") pod \"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:03 crc kubenswrapper[4834]: I0130 21:32:03.977233 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs6gl\" (UniqueName: \"kubernetes.io/projected/cce27a06-72ce-4b87-be28-e71501ec9291-kube-api-access-bs6gl\") pod \"metallb-operator-webhook-server-6c67b9b9df-6cd7q\" (UID: \"cce27a06-72ce-4b87-be28-e71501ec9291\") " pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.002385 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.150351 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4"] Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.160909 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.160952 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.160987 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.161619 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cb5a4bc85d48be6eae743481c416969d22a0b71074f3b32022fe8457dc9c32b"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.161669 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" 
containerID="cri-o://4cb5a4bc85d48be6eae743481c416969d22a0b71074f3b32022fe8457dc9c32b" gracePeriod=600 Jan 30 21:32:04 crc kubenswrapper[4834]: W0130 21:32:04.223533 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb5dfdf6_696e_40f9_95a6_baa1b909a02e.slice/crio-b56582f3e3d60e56df75e8cce9e192266033c088f70a62977117e15552d477d1 WatchSource:0}: Error finding container b56582f3e3d60e56df75e8cce9e192266033c088f70a62977117e15552d477d1: Status 404 returned error can't find the container with id b56582f3e3d60e56df75e8cce9e192266033c088f70a62977117e15552d477d1 Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.270621 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q"] Jan 30 21:32:04 crc kubenswrapper[4834]: W0130 21:32:04.277345 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcce27a06_72ce_4b87_be28_e71501ec9291.slice/crio-5f9ecf80aa7c9b1ea0d5d7e107081828b029284b7bc4553d7dd4537009016168 WatchSource:0}: Error finding container 5f9ecf80aa7c9b1ea0d5d7e107081828b029284b7bc4553d7dd4537009016168: Status 404 returned error can't find the container with id 5f9ecf80aa7c9b1ea0d5d7e107081828b029284b7bc4553d7dd4537009016168 Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.597675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" event={"ID":"db5dfdf6-696e-40f9-95a6-baa1b909a02e","Type":"ContainerStarted","Data":"b56582f3e3d60e56df75e8cce9e192266033c088f70a62977117e15552d477d1"} Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.598984 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" 
event={"ID":"cce27a06-72ce-4b87-be28-e71501ec9291","Type":"ContainerStarted","Data":"5f9ecf80aa7c9b1ea0d5d7e107081828b029284b7bc4553d7dd4537009016168"} Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.601075 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="4cb5a4bc85d48be6eae743481c416969d22a0b71074f3b32022fe8457dc9c32b" exitCode=0 Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.601107 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"4cb5a4bc85d48be6eae743481c416969d22a0b71074f3b32022fe8457dc9c32b"} Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.601124 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"75ffc2f37f0663828c033fce2d59c1e7b940cbd240e347042d2341fa7fcc4ac6"} Jan 30 21:32:04 crc kubenswrapper[4834]: I0130 21:32:04.601139 4834 scope.go:117] "RemoveContainer" containerID="76c0ed7f9e9f321f65e1f6d65b4089c7729795a844b8db9bc32d4d6eeeb8f6b8" Jan 30 21:32:09 crc kubenswrapper[4834]: I0130 21:32:09.643476 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" event={"ID":"db5dfdf6-696e-40f9-95a6-baa1b909a02e","Type":"ContainerStarted","Data":"252f75de27314539c5cbcbcdd6823427017f49f8141441e4bd436bf1e2edf4b5"} Jan 30 21:32:09 crc kubenswrapper[4834]: I0130 21:32:09.643965 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:09 crc kubenswrapper[4834]: I0130 21:32:09.647577 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" 
event={"ID":"cce27a06-72ce-4b87-be28-e71501ec9291","Type":"ContainerStarted","Data":"f4db734a011cb5785f77cbf24c6ece72ad0cefa2113bdffd23efdb843b0cadce"} Jan 30 21:32:09 crc kubenswrapper[4834]: I0130 21:32:09.647806 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:09 crc kubenswrapper[4834]: I0130 21:32:09.680753 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" podStartSLOduration=2.334379834 podStartE2EDuration="6.680721176s" podCreationTimestamp="2026-01-30 21:32:03 +0000 UTC" firstStartedPulling="2026-01-30 21:32:04.224842692 +0000 UTC m=+975.377988830" lastFinishedPulling="2026-01-30 21:32:08.571184034 +0000 UTC m=+979.724330172" observedRunningTime="2026-01-30 21:32:09.672104954 +0000 UTC m=+980.825251172" watchObservedRunningTime="2026-01-30 21:32:09.680721176 +0000 UTC m=+980.833867354" Jan 30 21:32:09 crc kubenswrapper[4834]: I0130 21:32:09.730007 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" podStartSLOduration=2.426021016 podStartE2EDuration="6.729981349s" podCreationTimestamp="2026-01-30 21:32:03 +0000 UTC" firstStartedPulling="2026-01-30 21:32:04.281112981 +0000 UTC m=+975.434259119" lastFinishedPulling="2026-01-30 21:32:08.585073314 +0000 UTC m=+979.738219452" observedRunningTime="2026-01-30 21:32:09.720271706 +0000 UTC m=+980.873417854" watchObservedRunningTime="2026-01-30 21:32:09.729981349 +0000 UTC m=+980.883127527" Jan 30 21:32:24 crc kubenswrapper[4834]: I0130 21:32:24.006706 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6c67b9b9df-6cd7q" Jan 30 21:32:43 crc kubenswrapper[4834]: I0130 21:32:43.693933 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-6dfb4f7bb8-zhhz4" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.507959 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j"] Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.508961 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.510904 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.511340 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vm7rq" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.518771 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6xr9z"] Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.523186 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.525210 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.525672 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.528159 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j"] Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597212 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-startup\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597282 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597327 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-sockets\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597357 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-reloader\") pod \"frr-k8s-6xr9z\" (UID: 
\"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597377 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics-certs\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597414 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-conf\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597473 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2czw\" (UniqueName: \"kubernetes.io/projected/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-kube-api-access-p2czw\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597509 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfxfk\" (UniqueName: \"kubernetes.io/projected/df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8-kube-api-access-xfxfk\") pod \"frr-k8s-webhook-server-7df86c4f6c-kj45j\" (UID: \"df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.597576 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-kj45j\" (UID: 
\"df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.604449 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dlj42"] Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.607507 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.609579 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.609769 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.609909 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.610024 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5qxjh" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.630327 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-gmxj2"] Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.631614 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.636043 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.655702 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-gmxj2"] Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.698805 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.698867 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-kj45j\" (UID: \"df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.698901 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-startup\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.698922 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-metrics-certs\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.698946 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.698964 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-sockets\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.698984 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-conf\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.698999 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-reloader\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.699013 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics-certs\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.699038 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpzcx\" (UniqueName: \"kubernetes.io/projected/2650188e-fece-45c8-a478-c8ea2ec1552d-kube-api-access-xpzcx\") pod 
\"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.699071 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2czw\" (UniqueName: \"kubernetes.io/projected/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-kube-api-access-p2czw\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.699088 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2650188e-fece-45c8-a478-c8ea2ec1552d-metallb-excludel2\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.699110 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfxfk\" (UniqueName: \"kubernetes.io/projected/df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8-kube-api-access-xfxfk\") pod \"frr-k8s-webhook-server-7df86c4f6c-kj45j\" (UID: \"df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.700912 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-reloader\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: E0130 21:32:44.700990 4834 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 30 21:32:44 crc kubenswrapper[4834]: E0130 21:32:44.701035 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics-certs podName:ff4a3ae3-827a-4874-9d7e-0be7bb7548a3 nodeName:}" failed. No retries permitted until 2026-01-30 21:32:45.201021091 +0000 UTC m=+1016.354167229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics-certs") pod "frr-k8s-6xr9z" (UID: "ff4a3ae3-827a-4874-9d7e-0be7bb7548a3") : secret "frr-k8s-certs-secret" not found Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.701166 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-conf\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.701311 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.701432 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-sockets\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.701746 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-frr-startup\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.711108 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-kj45j\" (UID: \"df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.718157 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfxfk\" (UniqueName: \"kubernetes.io/projected/df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8-kube-api-access-xfxfk\") pod \"frr-k8s-webhook-server-7df86c4f6c-kj45j\" (UID: \"df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.722774 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2czw\" (UniqueName: \"kubernetes.io/projected/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-kube-api-access-p2czw\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.800183 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z76qs\" (UniqueName: \"kubernetes.io/projected/aaffa237-4e73-45f7-9a04-e0cd97abc541-kube-api-access-z76qs\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.800247 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2650188e-fece-45c8-a478-c8ea2ec1552d-metallb-excludel2\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.800297 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: E0130 21:32:44.800385 4834 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 30 21:32:44 crc kubenswrapper[4834]: E0130 21:32:44.800450 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist podName:2650188e-fece-45c8-a478-c8ea2ec1552d nodeName:}" failed. No retries permitted until 2026-01-30 21:32:45.300433782 +0000 UTC m=+1016.453579920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist") pod "speaker-dlj42" (UID: "2650188e-fece-45c8-a478-c8ea2ec1552d") : secret "metallb-memberlist" not found Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.800434 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaffa237-4e73-45f7-9a04-e0cd97abc541-cert\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.800625 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-metrics-certs\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.800725 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/aaffa237-4e73-45f7-9a04-e0cd97abc541-metrics-certs\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.800848 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpzcx\" (UniqueName: \"kubernetes.io/projected/2650188e-fece-45c8-a478-c8ea2ec1552d-kube-api-access-xpzcx\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.801211 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2650188e-fece-45c8-a478-c8ea2ec1552d-metallb-excludel2\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: E0130 21:32:44.801282 4834 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 30 21:32:44 crc kubenswrapper[4834]: E0130 21:32:44.801310 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-metrics-certs podName:2650188e-fece-45c8-a478-c8ea2ec1552d nodeName:}" failed. No retries permitted until 2026-01-30 21:32:45.301303106 +0000 UTC m=+1016.454449244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-metrics-certs") pod "speaker-dlj42" (UID: "2650188e-fece-45c8-a478-c8ea2ec1552d") : secret "speaker-certs-secret" not found Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.818783 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpzcx\" (UniqueName: \"kubernetes.io/projected/2650188e-fece-45c8-a478-c8ea2ec1552d-kube-api-access-xpzcx\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.841669 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.901937 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaffa237-4e73-45f7-9a04-e0cd97abc541-cert\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.902036 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaffa237-4e73-45f7-9a04-e0cd97abc541-metrics-certs\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.902093 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z76qs\" (UniqueName: \"kubernetes.io/projected/aaffa237-4e73-45f7-9a04-e0cd97abc541-kube-api-access-z76qs\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 
21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.906246 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaffa237-4e73-45f7-9a04-e0cd97abc541-cert\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.906704 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaffa237-4e73-45f7-9a04-e0cd97abc541-metrics-certs\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.918102 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z76qs\" (UniqueName: \"kubernetes.io/projected/aaffa237-4e73-45f7-9a04-e0cd97abc541-kube-api-access-z76qs\") pod \"controller-6968d8fdc4-gmxj2\" (UID: \"aaffa237-4e73-45f7-9a04-e0cd97abc541\") " pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:44 crc kubenswrapper[4834]: I0130 21:32:44.944145 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.205144 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics-certs\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.209083 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff4a3ae3-827a-4874-9d7e-0be7bb7548a3-metrics-certs\") pod \"frr-k8s-6xr9z\" (UID: \"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3\") " pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.273964 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j"] Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.284158 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.306923 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.307002 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-metrics-certs\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:45 crc kubenswrapper[4834]: E0130 21:32:45.307710 4834 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Jan 30 21:32:45 crc kubenswrapper[4834]: E0130 21:32:45.307773 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist podName:2650188e-fece-45c8-a478-c8ea2ec1552d nodeName:}" failed. No retries permitted until 2026-01-30 21:32:46.307755671 +0000 UTC m=+1017.460901819 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist") pod "speaker-dlj42" (UID: "2650188e-fece-45c8-a478-c8ea2ec1552d") : secret "metallb-memberlist" not found Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.311487 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-metrics-certs\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.377214 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-gmxj2"] Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.450140 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.934718 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerStarted","Data":"b4d83862f00a7fb5ca9a7ef20906d5caadc961bf746087f20f06f3c303377e7a"} Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.936643 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" event={"ID":"df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8","Type":"ContainerStarted","Data":"1219a6d24d19ee0ec96c810b31b20b0ccef0af7e32fdd9ca36ee23f180bb7412"} Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.939730 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-gmxj2" event={"ID":"aaffa237-4e73-45f7-9a04-e0cd97abc541","Type":"ContainerStarted","Data":"e6e7040f5805d78350582419f63af5fe2e9f9260ff2795bb1265e6befa1a2794"} Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.939779 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-gmxj2" event={"ID":"aaffa237-4e73-45f7-9a04-e0cd97abc541","Type":"ContainerStarted","Data":"f62c6c2bc5654b16841073a4ec1968e777c26faedb2db2f34aefe910bce718d2"} Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.939799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-gmxj2" event={"ID":"aaffa237-4e73-45f7-9a04-e0cd97abc541","Type":"ContainerStarted","Data":"fa3467d93f4d08036ab18f1d18c4aa7cefd87ac61b525738c5d21b4aa7839081"} Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.939914 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:32:45 crc kubenswrapper[4834]: I0130 21:32:45.969827 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/controller-6968d8fdc4-gmxj2" podStartSLOduration=1.969799002 podStartE2EDuration="1.969799002s" podCreationTimestamp="2026-01-30 21:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:32:45.961910671 +0000 UTC m=+1017.115056849" watchObservedRunningTime="2026-01-30 21:32:45.969799002 +0000 UTC m=+1017.122945170" Jan 30 21:32:46 crc kubenswrapper[4834]: I0130 21:32:46.322265 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:46 crc kubenswrapper[4834]: I0130 21:32:46.330156 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2650188e-fece-45c8-a478-c8ea2ec1552d-memberlist\") pod \"speaker-dlj42\" (UID: \"2650188e-fece-45c8-a478-c8ea2ec1552d\") " pod="metallb-system/speaker-dlj42" Jan 30 21:32:46 crc kubenswrapper[4834]: I0130 21:32:46.423752 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dlj42" Jan 30 21:32:46 crc kubenswrapper[4834]: W0130 21:32:46.457964 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2650188e_fece_45c8_a478_c8ea2ec1552d.slice/crio-b585b7a3049f90299a372594dd8ecafabbd958bcad7280aa9f076a85499d021a WatchSource:0}: Error finding container b585b7a3049f90299a372594dd8ecafabbd958bcad7280aa9f076a85499d021a: Status 404 returned error can't find the container with id b585b7a3049f90299a372594dd8ecafabbd958bcad7280aa9f076a85499d021a Jan 30 21:32:46 crc kubenswrapper[4834]: I0130 21:32:46.948271 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dlj42" event={"ID":"2650188e-fece-45c8-a478-c8ea2ec1552d","Type":"ContainerStarted","Data":"3e2d01f9f0132ea2ef1a4bb9b5aa8ea2448a3b7e8b30b5a1e6aa7fbf227ed285"} Jan 30 21:32:46 crc kubenswrapper[4834]: I0130 21:32:46.948562 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dlj42" event={"ID":"2650188e-fece-45c8-a478-c8ea2ec1552d","Type":"ContainerStarted","Data":"b585b7a3049f90299a372594dd8ecafabbd958bcad7280aa9f076a85499d021a"} Jan 30 21:32:47 crc kubenswrapper[4834]: I0130 21:32:47.956842 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dlj42" event={"ID":"2650188e-fece-45c8-a478-c8ea2ec1552d","Type":"ContainerStarted","Data":"03a24132d9469bf06a6c31f223602625e3df7880358fee33ab0be565d176853d"} Jan 30 21:32:47 crc kubenswrapper[4834]: I0130 21:32:47.956991 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dlj42" Jan 30 21:32:47 crc kubenswrapper[4834]: I0130 21:32:47.977598 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dlj42" podStartSLOduration=3.9775820570000002 podStartE2EDuration="3.977582057s" podCreationTimestamp="2026-01-30 21:32:44 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:32:47.974970524 +0000 UTC m=+1019.128116662" watchObservedRunningTime="2026-01-30 21:32:47.977582057 +0000 UTC m=+1019.130728195" Jan 30 21:32:54 crc kubenswrapper[4834]: I0130 21:32:54.041753 4834 generic.go:334] "Generic (PLEG): container finished" podID="ff4a3ae3-827a-4874-9d7e-0be7bb7548a3" containerID="41d41c37be9022788962b71b83455e006646badaf5d015bf5df4b2c06b8cefbf" exitCode=0 Jan 30 21:32:54 crc kubenswrapper[4834]: I0130 21:32:54.041875 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerDied","Data":"41d41c37be9022788962b71b83455e006646badaf5d015bf5df4b2c06b8cefbf"} Jan 30 21:32:54 crc kubenswrapper[4834]: I0130 21:32:54.044895 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" event={"ID":"df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8","Type":"ContainerStarted","Data":"2c063fa02b42b6bb6a04c972d77e279cf75186840f24848e94fe7cda33bb5a05"} Jan 30 21:32:54 crc kubenswrapper[4834]: I0130 21:32:54.045557 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:32:55 crc kubenswrapper[4834]: I0130 21:32:55.058941 4834 generic.go:334] "Generic (PLEG): container finished" podID="ff4a3ae3-827a-4874-9d7e-0be7bb7548a3" containerID="b9932a43678f6c36b1b0da04a2bebded47bae7ac7f2175cdcd57259851489840" exitCode=0 Jan 30 21:32:55 crc kubenswrapper[4834]: I0130 21:32:55.059225 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerDied","Data":"b9932a43678f6c36b1b0da04a2bebded47bae7ac7f2175cdcd57259851489840"} Jan 30 21:32:55 crc kubenswrapper[4834]: I0130 21:32:55.092165 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" podStartSLOduration=2.944936814 podStartE2EDuration="11.092146528s" podCreationTimestamp="2026-01-30 21:32:44 +0000 UTC" firstStartedPulling="2026-01-30 21:32:45.283956893 +0000 UTC m=+1016.437103031" lastFinishedPulling="2026-01-30 21:32:53.431166597 +0000 UTC m=+1024.584312745" observedRunningTime="2026-01-30 21:32:54.112437309 +0000 UTC m=+1025.265583447" watchObservedRunningTime="2026-01-30 21:32:55.092146528 +0000 UTC m=+1026.245292666" Jan 30 21:32:56 crc kubenswrapper[4834]: I0130 21:32:56.071603 4834 generic.go:334] "Generic (PLEG): container finished" podID="ff4a3ae3-827a-4874-9d7e-0be7bb7548a3" containerID="99e3dfd131656a7551faef5a96f00ee2abf34bd7d620886386755fb580457f09" exitCode=0 Jan 30 21:32:56 crc kubenswrapper[4834]: I0130 21:32:56.071698 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerDied","Data":"99e3dfd131656a7551faef5a96f00ee2abf34bd7d620886386755fb580457f09"} Jan 30 21:32:56 crc kubenswrapper[4834]: I0130 21:32:56.428722 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dlj42" Jan 30 21:32:57 crc kubenswrapper[4834]: I0130 21:32:57.086962 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerStarted","Data":"5059c3d8909cffd8520cce8d32c91f741d08447365cc3e3a14f9a70fe226a8e6"} Jan 30 21:32:57 crc kubenswrapper[4834]: I0130 21:32:57.087197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerStarted","Data":"3cbbf367e235e20ec1fff32f84520f5864b08cff62fabf4904359f5410f5495d"} Jan 30 21:32:57 crc kubenswrapper[4834]: I0130 21:32:57.087209 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerStarted","Data":"e6038602f784147e4b26d944540ebdccdf09c4ad0b083f04848c579634bfea62"} Jan 30 21:32:57 crc kubenswrapper[4834]: I0130 21:32:57.087218 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerStarted","Data":"184703972e99e0a118c3225908c0bd754815fbb2a5665a3a35b63a9e6ca582a6"} Jan 30 21:32:57 crc kubenswrapper[4834]: I0130 21:32:57.087363 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:32:57 crc kubenswrapper[4834]: I0130 21:32:57.087376 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerStarted","Data":"1c5ba3a119c3a405eba62220182ea69a7396ed04a7d9a6dd9e701da36b21a83b"} Jan 30 21:32:57 crc kubenswrapper[4834]: I0130 21:32:57.087386 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xr9z" event={"ID":"ff4a3ae3-827a-4874-9d7e-0be7bb7548a3","Type":"ContainerStarted","Data":"3656d0fdcd42a5e157277482026e5d45332e97ee6bef3b710397c6c23036cc9c"} Jan 30 21:32:57 crc kubenswrapper[4834]: I0130 21:32:57.115296 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6xr9z" podStartSLOduration=5.229819004 podStartE2EDuration="13.115274062s" podCreationTimestamp="2026-01-30 21:32:44 +0000 UTC" firstStartedPulling="2026-01-30 21:32:45.573367675 +0000 UTC m=+1016.726513813" lastFinishedPulling="2026-01-30 21:32:53.458822723 +0000 UTC m=+1024.611968871" observedRunningTime="2026-01-30 21:32:57.109213302 +0000 UTC m=+1028.262359470" watchObservedRunningTime="2026-01-30 21:32:57.115274062 +0000 UTC m=+1028.268420240" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.205239 4834 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fbvn4"] Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.206563 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fbvn4" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.216640 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hpl4q" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.217719 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.224834 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.228320 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fbvn4"] Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.340153 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb99g\" (UniqueName: \"kubernetes.io/projected/cf761dbb-3136-4e00-9c4f-b38b817efa96-kube-api-access-zb99g\") pod \"openstack-operator-index-fbvn4\" (UID: \"cf761dbb-3136-4e00-9c4f-b38b817efa96\") " pod="openstack-operators/openstack-operator-index-fbvn4" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.441138 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb99g\" (UniqueName: \"kubernetes.io/projected/cf761dbb-3136-4e00-9c4f-b38b817efa96-kube-api-access-zb99g\") pod \"openstack-operator-index-fbvn4\" (UID: \"cf761dbb-3136-4e00-9c4f-b38b817efa96\") " pod="openstack-operators/openstack-operator-index-fbvn4" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.463052 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zb99g\" (UniqueName: \"kubernetes.io/projected/cf761dbb-3136-4e00-9c4f-b38b817efa96-kube-api-access-zb99g\") pod \"openstack-operator-index-fbvn4\" (UID: \"cf761dbb-3136-4e00-9c4f-b38b817efa96\") " pod="openstack-operators/openstack-operator-index-fbvn4" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.540425 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fbvn4" Jan 30 21:32:59 crc kubenswrapper[4834]: I0130 21:32:59.935900 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fbvn4"] Jan 30 21:32:59 crc kubenswrapper[4834]: W0130 21:32:59.939553 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf761dbb_3136_4e00_9c4f_b38b817efa96.slice/crio-1251765dab9903b051409b216bfc4979db794917f4fe8a48c20b58e68fdd4e33 WatchSource:0}: Error finding container 1251765dab9903b051409b216bfc4979db794917f4fe8a48c20b58e68fdd4e33: Status 404 returned error can't find the container with id 1251765dab9903b051409b216bfc4979db794917f4fe8a48c20b58e68fdd4e33 Jan 30 21:33:00 crc kubenswrapper[4834]: I0130 21:33:00.111337 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fbvn4" event={"ID":"cf761dbb-3136-4e00-9c4f-b38b817efa96","Type":"ContainerStarted","Data":"1251765dab9903b051409b216bfc4979db794917f4fe8a48c20b58e68fdd4e33"} Jan 30 21:33:00 crc kubenswrapper[4834]: I0130 21:33:00.451200 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:33:00 crc kubenswrapper[4834]: I0130 21:33:00.500641 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:33:01 crc kubenswrapper[4834]: I0130 21:33:01.977781 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-fbvn4"] Jan 30 21:33:02 crc kubenswrapper[4834]: I0130 21:33:02.592188 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cchsg"] Jan 30 21:33:02 crc kubenswrapper[4834]: I0130 21:33:02.593926 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:02 crc kubenswrapper[4834]: I0130 21:33:02.600477 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cchsg"] Jan 30 21:33:02 crc kubenswrapper[4834]: I0130 21:33:02.707343 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n2j9\" (UniqueName: \"kubernetes.io/projected/7337253d-358a-4bdc-8f28-0a0aad4afe6b-kube-api-access-6n2j9\") pod \"openstack-operator-index-cchsg\" (UID: \"7337253d-358a-4bdc-8f28-0a0aad4afe6b\") " pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:02 crc kubenswrapper[4834]: I0130 21:33:02.816559 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n2j9\" (UniqueName: \"kubernetes.io/projected/7337253d-358a-4bdc-8f28-0a0aad4afe6b-kube-api-access-6n2j9\") pod \"openstack-operator-index-cchsg\" (UID: \"7337253d-358a-4bdc-8f28-0a0aad4afe6b\") " pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:02 crc kubenswrapper[4834]: I0130 21:33:02.864487 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n2j9\" (UniqueName: \"kubernetes.io/projected/7337253d-358a-4bdc-8f28-0a0aad4afe6b-kube-api-access-6n2j9\") pod \"openstack-operator-index-cchsg\" (UID: \"7337253d-358a-4bdc-8f28-0a0aad4afe6b\") " pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:02 crc kubenswrapper[4834]: I0130 21:33:02.923249 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:03 crc kubenswrapper[4834]: I0130 21:33:03.451276 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cchsg"] Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.156734 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cchsg" event={"ID":"7337253d-358a-4bdc-8f28-0a0aad4afe6b","Type":"ContainerStarted","Data":"cc50b18ebf35d7050d875e7364b250ed4ef2b0a3c7fa715c22004abdb13535bc"} Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.157041 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cchsg" event={"ID":"7337253d-358a-4bdc-8f28-0a0aad4afe6b","Type":"ContainerStarted","Data":"779b7eae1027df575cfb68d5a93c42351ddce84f7b8b500b52f10f929a4f121c"} Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.158727 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fbvn4" event={"ID":"cf761dbb-3136-4e00-9c4f-b38b817efa96","Type":"ContainerStarted","Data":"9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1"} Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.158816 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-fbvn4" podUID="cf761dbb-3136-4e00-9c4f-b38b817efa96" containerName="registry-server" containerID="cri-o://9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1" gracePeriod=2 Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.181292 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cchsg" podStartSLOduration=2.103841596 podStartE2EDuration="2.181271209s" podCreationTimestamp="2026-01-30 21:33:02 +0000 UTC" firstStartedPulling="2026-01-30 21:33:03.46270484 +0000 UTC 
m=+1034.615850988" lastFinishedPulling="2026-01-30 21:33:03.540134463 +0000 UTC m=+1034.693280601" observedRunningTime="2026-01-30 21:33:04.175739323 +0000 UTC m=+1035.328885491" watchObservedRunningTime="2026-01-30 21:33:04.181271209 +0000 UTC m=+1035.334417357" Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.192441 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fbvn4" podStartSLOduration=2.049013633 podStartE2EDuration="5.192417942s" podCreationTimestamp="2026-01-30 21:32:59 +0000 UTC" firstStartedPulling="2026-01-30 21:32:59.941448166 +0000 UTC m=+1031.094594304" lastFinishedPulling="2026-01-30 21:33:03.084852445 +0000 UTC m=+1034.237998613" observedRunningTime="2026-01-30 21:33:04.189367076 +0000 UTC m=+1035.342513224" watchObservedRunningTime="2026-01-30 21:33:04.192417942 +0000 UTC m=+1035.345564120" Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.721439 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fbvn4" Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.847077 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb99g\" (UniqueName: \"kubernetes.io/projected/cf761dbb-3136-4e00-9c4f-b38b817efa96-kube-api-access-zb99g\") pod \"cf761dbb-3136-4e00-9c4f-b38b817efa96\" (UID: \"cf761dbb-3136-4e00-9c4f-b38b817efa96\") " Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.849910 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kj45j" Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.854681 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf761dbb-3136-4e00-9c4f-b38b817efa96-kube-api-access-zb99g" (OuterVolumeSpecName: "kube-api-access-zb99g") pod "cf761dbb-3136-4e00-9c4f-b38b817efa96" (UID: "cf761dbb-3136-4e00-9c4f-b38b817efa96"). InnerVolumeSpecName "kube-api-access-zb99g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.952001 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb99g\" (UniqueName: \"kubernetes.io/projected/cf761dbb-3136-4e00-9c4f-b38b817efa96-kube-api-access-zb99g\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:04 crc kubenswrapper[4834]: I0130 21:33:04.952891 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-gmxj2" Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.170267 4834 generic.go:334] "Generic (PLEG): container finished" podID="cf761dbb-3136-4e00-9c4f-b38b817efa96" containerID="9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1" exitCode=0 Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.170359 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fbvn4" Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.170431 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fbvn4" event={"ID":"cf761dbb-3136-4e00-9c4f-b38b817efa96","Type":"ContainerDied","Data":"9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1"} Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.170530 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fbvn4" event={"ID":"cf761dbb-3136-4e00-9c4f-b38b817efa96","Type":"ContainerDied","Data":"1251765dab9903b051409b216bfc4979db794917f4fe8a48c20b58e68fdd4e33"} Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.170585 4834 scope.go:117] "RemoveContainer" containerID="9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1" Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.196783 4834 scope.go:117] "RemoveContainer" containerID="9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1" Jan 30 21:33:05 crc kubenswrapper[4834]: E0130 21:33:05.197282 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1\": container with ID starting with 9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1 not found: ID does not exist" containerID="9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1" Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.197320 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1"} err="failed to get container status \"9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1\": rpc error: code = NotFound desc = could not find container 
\"9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1\": container with ID starting with 9e404825e37745c55e44d782c432ed25fcd3369d9287cf3e05fd2dc4b6b496f1 not found: ID does not exist" Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.216248 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fbvn4"] Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.226629 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-fbvn4"] Jan 30 21:33:05 crc kubenswrapper[4834]: I0130 21:33:05.538064 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf761dbb-3136-4e00-9c4f-b38b817efa96" path="/var/lib/kubelet/pods/cf761dbb-3136-4e00-9c4f-b38b817efa96/volumes" Jan 30 21:33:12 crc kubenswrapper[4834]: I0130 21:33:12.923674 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:12 crc kubenswrapper[4834]: I0130 21:33:12.925834 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:12 crc kubenswrapper[4834]: I0130 21:33:12.997357 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:13 crc kubenswrapper[4834]: I0130 21:33:13.279024 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-cchsg" Jan 30 21:33:14 crc kubenswrapper[4834]: I0130 21:33:14.835455 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5"] Jan 30 21:33:14 crc kubenswrapper[4834]: E0130 21:33:14.836348 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf761dbb-3136-4e00-9c4f-b38b817efa96" containerName="registry-server" Jan 30 21:33:14 
crc kubenswrapper[4834]: I0130 21:33:14.836380 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf761dbb-3136-4e00-9c4f-b38b817efa96" containerName="registry-server" Jan 30 21:33:14 crc kubenswrapper[4834]: I0130 21:33:14.836755 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf761dbb-3136-4e00-9c4f-b38b817efa96" containerName="registry-server" Jan 30 21:33:14 crc kubenswrapper[4834]: I0130 21:33:14.839022 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:14 crc kubenswrapper[4834]: I0130 21:33:14.843243 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5"] Jan 30 21:33:14 crc kubenswrapper[4834]: I0130 21:33:14.860686 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w84wv" Jan 30 21:33:14 crc kubenswrapper[4834]: I0130 21:33:14.986318 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-bundle\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:14 crc kubenswrapper[4834]: I0130 21:33:14.986442 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-util\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:14 crc kubenswrapper[4834]: I0130 21:33:14.986520 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmp6t\" (UniqueName: \"kubernetes.io/projected/fbe6ba8c-1c28-4778-ba40-bb22671864ed-kube-api-access-dmp6t\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.087900 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-bundle\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.087983 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-util\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.088054 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmp6t\" (UniqueName: \"kubernetes.io/projected/fbe6ba8c-1c28-4778-ba40-bb22671864ed-kube-api-access-dmp6t\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.089043 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-util\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.089074 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-bundle\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.111835 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmp6t\" (UniqueName: \"kubernetes.io/projected/fbe6ba8c-1c28-4778-ba40-bb22671864ed-kube-api-access-dmp6t\") pod \"c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.161504 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.455956 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6xr9z" Jan 30 21:33:15 crc kubenswrapper[4834]: I0130 21:33:15.612264 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5"] Jan 30 21:33:15 crc kubenswrapper[4834]: W0130 21:33:15.618983 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe6ba8c_1c28_4778_ba40_bb22671864ed.slice/crio-fd8f9e1224e1c7a78bcb1efa02068e79181efe2b5699a0704ad140bb39f0c8ba WatchSource:0}: Error finding container fd8f9e1224e1c7a78bcb1efa02068e79181efe2b5699a0704ad140bb39f0c8ba: Status 404 returned error can't find the container with id fd8f9e1224e1c7a78bcb1efa02068e79181efe2b5699a0704ad140bb39f0c8ba Jan 30 21:33:16 crc kubenswrapper[4834]: I0130 21:33:16.269703 4834 generic.go:334] "Generic (PLEG): container finished" podID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerID="1733fa9c0817a8e20f9c806e578fd02b88955fac140fb825544f7d07933a7d96" exitCode=0 Jan 30 21:33:16 crc kubenswrapper[4834]: I0130 21:33:16.269774 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" event={"ID":"fbe6ba8c-1c28-4778-ba40-bb22671864ed","Type":"ContainerDied","Data":"1733fa9c0817a8e20f9c806e578fd02b88955fac140fb825544f7d07933a7d96"} Jan 30 21:33:16 crc kubenswrapper[4834]: I0130 21:33:16.269821 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" 
event={"ID":"fbe6ba8c-1c28-4778-ba40-bb22671864ed","Type":"ContainerStarted","Data":"fd8f9e1224e1c7a78bcb1efa02068e79181efe2b5699a0704ad140bb39f0c8ba"} Jan 30 21:33:18 crc kubenswrapper[4834]: I0130 21:33:18.289129 4834 generic.go:334] "Generic (PLEG): container finished" podID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerID="2f1e51de444e2aadc4379d0ac9e6df5e5806d117ab7ac69f1d7f0b87f2067ee9" exitCode=0 Jan 30 21:33:18 crc kubenswrapper[4834]: I0130 21:33:18.289216 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" event={"ID":"fbe6ba8c-1c28-4778-ba40-bb22671864ed","Type":"ContainerDied","Data":"2f1e51de444e2aadc4379d0ac9e6df5e5806d117ab7ac69f1d7f0b87f2067ee9"} Jan 30 21:33:19 crc kubenswrapper[4834]: I0130 21:33:19.299867 4834 generic.go:334] "Generic (PLEG): container finished" podID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerID="2d585d59143e0fbc03bd3a7317e87959cc0d11e7010e1abd9d5531c4e2e956df" exitCode=0 Jan 30 21:33:19 crc kubenswrapper[4834]: I0130 21:33:19.300005 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" event={"ID":"fbe6ba8c-1c28-4778-ba40-bb22671864ed","Type":"ContainerDied","Data":"2d585d59143e0fbc03bd3a7317e87959cc0d11e7010e1abd9d5531c4e2e956df"} Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.686571 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.778500 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-util\") pod \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.778622 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-bundle\") pod \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.778656 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmp6t\" (UniqueName: \"kubernetes.io/projected/fbe6ba8c-1c28-4778-ba40-bb22671864ed-kube-api-access-dmp6t\") pod \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\" (UID: \"fbe6ba8c-1c28-4778-ba40-bb22671864ed\") " Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.779870 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-bundle" (OuterVolumeSpecName: "bundle") pod "fbe6ba8c-1c28-4778-ba40-bb22671864ed" (UID: "fbe6ba8c-1c28-4778-ba40-bb22671864ed"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.786033 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe6ba8c-1c28-4778-ba40-bb22671864ed-kube-api-access-dmp6t" (OuterVolumeSpecName: "kube-api-access-dmp6t") pod "fbe6ba8c-1c28-4778-ba40-bb22671864ed" (UID: "fbe6ba8c-1c28-4778-ba40-bb22671864ed"). InnerVolumeSpecName "kube-api-access-dmp6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.801623 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-util" (OuterVolumeSpecName: "util") pod "fbe6ba8c-1c28-4778-ba40-bb22671864ed" (UID: "fbe6ba8c-1c28-4778-ba40-bb22671864ed"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.880732 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-util\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.880812 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fbe6ba8c-1c28-4778-ba40-bb22671864ed-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:20 crc kubenswrapper[4834]: I0130 21:33:20.880832 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmp6t\" (UniqueName: \"kubernetes.io/projected/fbe6ba8c-1c28-4778-ba40-bb22671864ed-kube-api-access-dmp6t\") on node \"crc\" DevicePath \"\"" Jan 30 21:33:21 crc kubenswrapper[4834]: I0130 21:33:21.317117 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" event={"ID":"fbe6ba8c-1c28-4778-ba40-bb22671864ed","Type":"ContainerDied","Data":"fd8f9e1224e1c7a78bcb1efa02068e79181efe2b5699a0704ad140bb39f0c8ba"} Jan 30 21:33:21 crc kubenswrapper[4834]: I0130 21:33:21.317461 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd8f9e1224e1c7a78bcb1efa02068e79181efe2b5699a0704ad140bb39f0c8ba" Jan 30 21:33:21 crc kubenswrapper[4834]: I0130 21:33:21.317196 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.152598 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn"] Jan 30 21:33:26 crc kubenswrapper[4834]: E0130 21:33:26.153355 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerName="extract" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.153371 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerName="extract" Jan 30 21:33:26 crc kubenswrapper[4834]: E0130 21:33:26.153405 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerName="pull" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.153413 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerName="pull" Jan 30 21:33:26 crc kubenswrapper[4834]: E0130 21:33:26.153425 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerName="util" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.153433 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerName="util" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.153605 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe6ba8c-1c28-4778-ba40-bb22671864ed" containerName="extract" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.154338 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.156562 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-ds2rj" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.187766 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn"] Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.355679 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8h6v\" (UniqueName: \"kubernetes.io/projected/e28ad8de-912f-418e-9706-56f0dd055527-kube-api-access-v8h6v\") pod \"openstack-operator-controller-init-55fdcd6c79-cwbsn\" (UID: \"e28ad8de-912f-418e-9706-56f0dd055527\") " pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.457581 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8h6v\" (UniqueName: \"kubernetes.io/projected/e28ad8de-912f-418e-9706-56f0dd055527-kube-api-access-v8h6v\") pod \"openstack-operator-controller-init-55fdcd6c79-cwbsn\" (UID: \"e28ad8de-912f-418e-9706-56f0dd055527\") " pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.477836 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8h6v\" (UniqueName: \"kubernetes.io/projected/e28ad8de-912f-418e-9706-56f0dd055527-kube-api-access-v8h6v\") pod \"openstack-operator-controller-init-55fdcd6c79-cwbsn\" (UID: \"e28ad8de-912f-418e-9706-56f0dd055527\") " pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.478228 4834 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" Jan 30 21:33:26 crc kubenswrapper[4834]: I0130 21:33:26.994255 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn"] Jan 30 21:33:27 crc kubenswrapper[4834]: I0130 21:33:27.367356 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" event={"ID":"e28ad8de-912f-418e-9706-56f0dd055527","Type":"ContainerStarted","Data":"dcc1337c58a5227595baef33f7a7893f373cb08538c805f9095cd1df81ce558b"} Jan 30 21:33:32 crc kubenswrapper[4834]: I0130 21:33:32.410535 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" event={"ID":"e28ad8de-912f-418e-9706-56f0dd055527","Type":"ContainerStarted","Data":"b6bf0db633c5a994addb36374ef138ae81367a980ab80f1561d6255ece6c690c"} Jan 30 21:33:32 crc kubenswrapper[4834]: I0130 21:33:32.411218 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" Jan 30 21:33:32 crc kubenswrapper[4834]: I0130 21:33:32.446648 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" podStartSLOduration=2.160415981 podStartE2EDuration="6.446626986s" podCreationTimestamp="2026-01-30 21:33:26 +0000 UTC" firstStartedPulling="2026-01-30 21:33:27.002148341 +0000 UTC m=+1058.155294479" lastFinishedPulling="2026-01-30 21:33:31.288359346 +0000 UTC m=+1062.441505484" observedRunningTime="2026-01-30 21:33:32.437808798 +0000 UTC m=+1063.590954956" watchObservedRunningTime="2026-01-30 21:33:32.446626986 +0000 UTC m=+1063.599773124" Jan 30 21:33:36 crc kubenswrapper[4834]: I0130 21:33:36.482278 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-55fdcd6c79-cwbsn" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.811964 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s"] Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.813164 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.815605 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ndl9w" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.823017 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf"] Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.824137 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.826020 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2ggp9" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.831616 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s"] Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.854378 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf"] Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.860289 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk"] Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.861200 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.863265 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-m9czf" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.870699 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw"] Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.871610 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.873978 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6xw7w" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.913951 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk"] Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.957316 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7q5\" (UniqueName: \"kubernetes.io/projected/6371c2f9-d19b-4b87-b0db-ba05d48ea5fb-kube-api-access-sd7q5\") pod \"cinder-operator-controller-manager-8d874c8fc-4crlf\" (UID: \"6371c2f9-d19b-4b87-b0db-ba05d48ea5fb\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.957473 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwwt\" (UniqueName: \"kubernetes.io/projected/9c50c180-8d91-43d0-bb07-4ea3881a1751-kube-api-access-jlwwt\") pod \"glance-operator-controller-manager-8886f4c47-x8rnk\" (UID: \"9c50c180-8d91-43d0-bb07-4ea3881a1751\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.957573 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qvpv\" (UniqueName: \"kubernetes.io/projected/2e538c79-bca6-46f0-a63d-fb537639f206-kube-api-access-6qvpv\") pod \"designate-operator-controller-manager-6d9697b7f4-hjdtw\" (UID: \"2e538c79-bca6-46f0-a63d-fb537639f206\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 
21:33:55.957632 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g499t\" (UniqueName: \"kubernetes.io/projected/2156cb3c-172b-4268-86e6-64b1d40b87ed-kube-api-access-g499t\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-95x2s\" (UID: \"2156cb3c-172b-4268-86e6-64b1d40b87ed\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" Jan 30 21:33:55 crc kubenswrapper[4834]: I0130 21:33:55.979068 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.023139 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.024487 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.028076 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-4zg4l" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.046528 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.047641 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.057680 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-pb9qr" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.058515 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlwwt\" (UniqueName: \"kubernetes.io/projected/9c50c180-8d91-43d0-bb07-4ea3881a1751-kube-api-access-jlwwt\") pod \"glance-operator-controller-manager-8886f4c47-x8rnk\" (UID: \"9c50c180-8d91-43d0-bb07-4ea3881a1751\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.058558 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qvpv\" (UniqueName: \"kubernetes.io/projected/2e538c79-bca6-46f0-a63d-fb537639f206-kube-api-access-6qvpv\") pod \"designate-operator-controller-manager-6d9697b7f4-hjdtw\" (UID: \"2e538c79-bca6-46f0-a63d-fb537639f206\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.058594 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g499t\" (UniqueName: \"kubernetes.io/projected/2156cb3c-172b-4268-86e6-64b1d40b87ed-kube-api-access-g499t\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-95x2s\" (UID: \"2156cb3c-172b-4268-86e6-64b1d40b87ed\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.058626 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd7q5\" (UniqueName: \"kubernetes.io/projected/6371c2f9-d19b-4b87-b0db-ba05d48ea5fb-kube-api-access-sd7q5\") pod 
\"cinder-operator-controller-manager-8d874c8fc-4crlf\" (UID: \"6371c2f9-d19b-4b87-b0db-ba05d48ea5fb\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.067822 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.094545 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g499t\" (UniqueName: \"kubernetes.io/projected/2156cb3c-172b-4268-86e6-64b1d40b87ed-kube-api-access-g499t\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-95x2s\" (UID: \"2156cb3c-172b-4268-86e6-64b1d40b87ed\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.094547 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qvpv\" (UniqueName: \"kubernetes.io/projected/2e538c79-bca6-46f0-a63d-fb537639f206-kube-api-access-6qvpv\") pod \"designate-operator-controller-manager-6d9697b7f4-hjdtw\" (UID: \"2e538c79-bca6-46f0-a63d-fb537639f206\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.100007 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd7q5\" (UniqueName: \"kubernetes.io/projected/6371c2f9-d19b-4b87-b0db-ba05d48ea5fb-kube-api-access-sd7q5\") pod \"cinder-operator-controller-manager-8d874c8fc-4crlf\" (UID: \"6371c2f9-d19b-4b87-b0db-ba05d48ea5fb\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.103494 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.104790 4834 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.114227 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wpmcj" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.114431 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.115207 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwwt\" (UniqueName: \"kubernetes.io/projected/9c50c180-8d91-43d0-bb07-4ea3881a1751-kube-api-access-jlwwt\") pod \"glance-operator-controller-manager-8886f4c47-x8rnk\" (UID: \"9c50c180-8d91-43d0-bb07-4ea3881a1751\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.119024 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.125481 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.126948 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.130329 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.133684 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-9ghdq" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.141632 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.151440 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.162811 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnnbq\" (UniqueName: \"kubernetes.io/projected/88accce3-ac33-420c-aa10-b7fea0b498c3-kube-api-access-gnnbq\") pod \"heat-operator-controller-manager-69d6db494d-l2pv5\" (UID: \"88accce3-ac33-420c-aa10-b7fea0b498c3\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.162925 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b48ns\" (UniqueName: \"kubernetes.io/projected/afece73a-c2b2-4905-819f-e8c73d968968-kube-api-access-b48ns\") pod \"horizon-operator-controller-manager-5fb775575f-pkzb7\" (UID: \"afece73a-c2b2-4905-819f-e8c73d968968\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.168628 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.169975 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.170720 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.176734 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-sjbr7" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.180849 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.181678 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.182367 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.188210 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zzjtd" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.203058 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.211630 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.225612 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.226622 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.234893 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-djpwc" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.242765 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.251243 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.261692 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-889p5"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.262579 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.264322 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.264358 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbz6l\" (UniqueName: \"kubernetes.io/projected/b0b534e5-be84-4fd0-a8f6-ee233988095e-kube-api-access-jbz6l\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.264388 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcb8n\" (UniqueName: \"kubernetes.io/projected/cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73-kube-api-access-vcb8n\") pod \"ironic-operator-controller-manager-5f4b8bd54d-xqpmp\" (UID: \"cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.264425 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b48ns\" (UniqueName: \"kubernetes.io/projected/afece73a-c2b2-4905-819f-e8c73d968968-kube-api-access-b48ns\") pod \"horizon-operator-controller-manager-5fb775575f-pkzb7\" (UID: \"afece73a-c2b2-4905-819f-e8c73d968968\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" Jan 30 21:33:56 crc kubenswrapper[4834]: 
I0130 21:33:56.264453 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnnbq\" (UniqueName: \"kubernetes.io/projected/88accce3-ac33-420c-aa10-b7fea0b498c3-kube-api-access-gnnbq\") pod \"heat-operator-controller-manager-69d6db494d-l2pv5\" (UID: \"88accce3-ac33-420c-aa10-b7fea0b498c3\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.264494 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5sq\" (UniqueName: \"kubernetes.io/projected/d6aef56e-f8a3-4400-b28f-5bd40a323c73-kube-api-access-kp5sq\") pod \"manila-operator-controller-manager-7dd968899f-ccd87\" (UID: \"d6aef56e-f8a3-4400-b28f-5bd40a323c73\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.264520 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wncvv\" (UniqueName: \"kubernetes.io/projected/ec5f427f-4be8-4066-a817-c9e2e3df4e6f-kube-api-access-wncvv\") pod \"keystone-operator-controller-manager-84f48565d4-n4kck\" (UID: \"ec5f427f-4be8-4066-a817-c9e2e3df4e6f\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.268592 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-6mxch" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.285707 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-lllql"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.287824 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.295696 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.300103 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.308921 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnnbq\" (UniqueName: \"kubernetes.io/projected/88accce3-ac33-420c-aa10-b7fea0b498c3-kube-api-access-gnnbq\") pod \"heat-operator-controller-manager-69d6db494d-l2pv5\" (UID: \"88accce3-ac33-420c-aa10-b7fea0b498c3\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.309105 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-frrn7" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.309499 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b48ns\" (UniqueName: \"kubernetes.io/projected/afece73a-c2b2-4905-819f-e8c73d968968-kube-api-access-b48ns\") pod \"horizon-operator-controller-manager-5fb775575f-pkzb7\" (UID: \"afece73a-c2b2-4905-819f-e8c73d968968\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.309782 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-wdt6f" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.332669 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-lllql"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.355655 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-889p5"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.366362 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5sq\" (UniqueName: \"kubernetes.io/projected/d6aef56e-f8a3-4400-b28f-5bd40a323c73-kube-api-access-kp5sq\") pod \"manila-operator-controller-manager-7dd968899f-ccd87\" (UID: \"d6aef56e-f8a3-4400-b28f-5bd40a323c73\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.366419 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8zjh\" (UniqueName: \"kubernetes.io/projected/d68f882f-c07c-4022-a6fa-f4814f313870-kube-api-access-q8zjh\") pod \"mariadb-operator-controller-manager-67bf948998-vc2wt\" (UID: \"d68f882f-c07c-4022-a6fa-f4814f313870\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.366444 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wncvv\" (UniqueName: \"kubernetes.io/projected/ec5f427f-4be8-4066-a817-c9e2e3df4e6f-kube-api-access-wncvv\") pod \"keystone-operator-controller-manager-84f48565d4-n4kck\" (UID: \"ec5f427f-4be8-4066-a817-c9e2e3df4e6f\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.366473 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: 
\"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.366499 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbz6l\" (UniqueName: \"kubernetes.io/projected/b0b534e5-be84-4fd0-a8f6-ee233988095e-kube-api-access-jbz6l\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.366527 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcb8n\" (UniqueName: \"kubernetes.io/projected/cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73-kube-api-access-vcb8n\") pod \"ironic-operator-controller-manager-5f4b8bd54d-xqpmp\" (UID: \"cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.366562 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knw94\" (UniqueName: \"kubernetes.io/projected/eea6da10-7c27-42c7-a532-f872f8b7c86a-kube-api-access-knw94\") pod \"nova-operator-controller-manager-55bff696bd-889p5\" (UID: \"eea6da10-7c27-42c7-a532-f872f8b7c86a\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" Jan 30 21:33:56 crc kubenswrapper[4834]: E0130 21:33:56.366925 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 21:33:56 crc kubenswrapper[4834]: E0130 21:33:56.366968 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert podName:b0b534e5-be84-4fd0-a8f6-ee233988095e nodeName:}" failed. 
No retries permitted until 2026-01-30 21:33:56.866956039 +0000 UTC m=+1088.020102177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert") pod "infra-operator-controller-manager-79955696d6-6rdsk" (UID: "b0b534e5-be84-4fd0-a8f6-ee233988095e") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.368448 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.377467 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.377937 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.393263 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcb8n\" (UniqueName: \"kubernetes.io/projected/cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73-kube-api-access-vcb8n\") pod \"ironic-operator-controller-manager-5f4b8bd54d-xqpmp\" (UID: \"cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.400065 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wncvv\" (UniqueName: \"kubernetes.io/projected/ec5f427f-4be8-4066-a817-c9e2e3df4e6f-kube-api-access-wncvv\") pod \"keystone-operator-controller-manager-84f48565d4-n4kck\" (UID: \"ec5f427f-4be8-4066-a817-c9e2e3df4e6f\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 
21:33:56.401018 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.420351 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbz6l\" (UniqueName: \"kubernetes.io/projected/b0b534e5-be84-4fd0-a8f6-ee233988095e-kube-api-access-jbz6l\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.422592 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.423700 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.435068 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tb856" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.439650 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5sq\" (UniqueName: \"kubernetes.io/projected/d6aef56e-f8a3-4400-b28f-5bd40a323c73-kube-api-access-kp5sq\") pod \"manila-operator-controller-manager-7dd968899f-ccd87\" (UID: \"d6aef56e-f8a3-4400-b28f-5bd40a323c73\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.457360 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.468166 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sm22\" (UniqueName: \"kubernetes.io/projected/e22f9414-3441-42d6-adde-8629c168c055-kube-api-access-5sm22\") pod \"neutron-operator-controller-manager-585dbc889-lllql\" (UID: \"e22f9414-3441-42d6-adde-8629c168c055\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.468216 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq2vn\" (UniqueName: \"kubernetes.io/projected/b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6-kube-api-access-jq2vn\") pod \"octavia-operator-controller-manager-6687f8d877-xrvft\" (UID: \"b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.468265 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knw94\" (UniqueName: \"kubernetes.io/projected/eea6da10-7c27-42c7-a532-f872f8b7c86a-kube-api-access-knw94\") pod \"nova-operator-controller-manager-55bff696bd-889p5\" (UID: \"eea6da10-7c27-42c7-a532-f872f8b7c86a\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.468319 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8zjh\" (UniqueName: \"kubernetes.io/projected/d68f882f-c07c-4022-a6fa-f4814f313870-kube-api-access-q8zjh\") pod \"mariadb-operator-controller-manager-67bf948998-vc2wt\" (UID: \"d68f882f-c07c-4022-a6fa-f4814f313870\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" Jan 30 21:33:56 crc 
kubenswrapper[4834]: I0130 21:33:56.468773 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.469679 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.472663 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-s9rdx" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.487451 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.488370 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.501745 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.501948 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dsv76" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.504101 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knw94\" (UniqueName: \"kubernetes.io/projected/eea6da10-7c27-42c7-a532-f872f8b7c86a-kube-api-access-knw94\") pod \"nova-operator-controller-manager-55bff696bd-889p5\" (UID: \"eea6da10-7c27-42c7-a532-f872f8b7c86a\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.516539 4834 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.536415 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.557490 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8zjh\" (UniqueName: \"kubernetes.io/projected/d68f882f-c07c-4022-a6fa-f4814f313870-kube-api-access-q8zjh\") pod \"mariadb-operator-controller-manager-67bf948998-vc2wt\" (UID: \"d68f882f-c07c-4022-a6fa-f4814f313870\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.558189 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.569587 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.569644 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw2hh\" (UniqueName: \"kubernetes.io/projected/4d4e0469-0167-410e-9bfd-26f81b9900cf-kube-api-access-pw2hh\") pod \"placement-operator-controller-manager-5b964cf4cd-45npx\" (UID: \"4d4e0469-0167-410e-9bfd-26f81b9900cf\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.569677 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf5gg\" (UniqueName: \"kubernetes.io/projected/1ba55708-11fd-4a17-9a95-88fd28711fb6-kube-api-access-qf5gg\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.569717 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sm22\" (UniqueName: \"kubernetes.io/projected/e22f9414-3441-42d6-adde-8629c168c055-kube-api-access-5sm22\") pod \"neutron-operator-controller-manager-585dbc889-lllql\" (UID: \"e22f9414-3441-42d6-adde-8629c168c055\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.569740 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlzsq\" (UniqueName: \"kubernetes.io/projected/1855b8a8-7a5e-4516-9bc6-156c6bb52068-kube-api-access-qlzsq\") pod \"ovn-operator-controller-manager-788c46999f-phxxg\" (UID: \"1855b8a8-7a5e-4516-9bc6-156c6bb52068\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.569759 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq2vn\" (UniqueName: \"kubernetes.io/projected/b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6-kube-api-access-jq2vn\") pod \"octavia-operator-controller-manager-6687f8d877-xrvft\" (UID: \"b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.607264 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq2vn\" 
(UniqueName: \"kubernetes.io/projected/b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6-kube-api-access-jq2vn\") pod \"octavia-operator-controller-manager-6687f8d877-xrvft\" (UID: \"b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.630339 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.631192 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.645010 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-k68t2" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.646904 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sm22\" (UniqueName: \"kubernetes.io/projected/e22f9414-3441-42d6-adde-8629c168c055-kube-api-access-5sm22\") pod \"neutron-operator-controller-manager-585dbc889-lllql\" (UID: \"e22f9414-3441-42d6-adde-8629c168c055\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.647943 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.662483 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.673094 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.673223 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw2hh\" (UniqueName: \"kubernetes.io/projected/4d4e0469-0167-410e-9bfd-26f81b9900cf-kube-api-access-pw2hh\") pod \"placement-operator-controller-manager-5b964cf4cd-45npx\" (UID: \"4d4e0469-0167-410e-9bfd-26f81b9900cf\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.673264 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf5gg\" (UniqueName: \"kubernetes.io/projected/1ba55708-11fd-4a17-9a95-88fd28711fb6-kube-api-access-qf5gg\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.673387 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlzsq\" (UniqueName: \"kubernetes.io/projected/1855b8a8-7a5e-4516-9bc6-156c6bb52068-kube-api-access-qlzsq\") pod \"ovn-operator-controller-manager-788c46999f-phxxg\" (UID: \"1855b8a8-7a5e-4516-9bc6-156c6bb52068\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" Jan 30 21:33:56 crc kubenswrapper[4834]: E0130 
21:33:56.682012 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:33:56 crc kubenswrapper[4834]: E0130 21:33:56.682104 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert podName:1ba55708-11fd-4a17-9a95-88fd28711fb6 nodeName:}" failed. No retries permitted until 2026-01-30 21:33:57.182079954 +0000 UTC m=+1088.335226092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" (UID: "1ba55708-11fd-4a17-9a95-88fd28711fb6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.709431 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlzsq\" (UniqueName: \"kubernetes.io/projected/1855b8a8-7a5e-4516-9bc6-156c6bb52068-kube-api-access-qlzsq\") pod \"ovn-operator-controller-manager-788c46999f-phxxg\" (UID: \"1855b8a8-7a5e-4516-9bc6-156c6bb52068\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.716766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw2hh\" (UniqueName: \"kubernetes.io/projected/4d4e0469-0167-410e-9bfd-26f81b9900cf-kube-api-access-pw2hh\") pod \"placement-operator-controller-manager-5b964cf4cd-45npx\" (UID: \"4d4e0469-0167-410e-9bfd-26f81b9900cf\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.717063 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf5gg\" (UniqueName: 
\"kubernetes.io/projected/1ba55708-11fd-4a17-9a95-88fd28711fb6-kube-api-access-qf5gg\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.717825 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.774333 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.776483 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgpdb\" (UniqueName: \"kubernetes.io/projected/2150b962-b815-4dec-ac4c-468aad4dc16c-kube-api-access-tgpdb\") pod \"swift-operator-controller-manager-68fc8c869-fgvhj\" (UID: \"2150b962-b815-4dec-ac4c-468aad4dc16c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.796311 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.808116 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.809379 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.809954 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.812075 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-78pdx" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.819888 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.882117 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgpdb\" (UniqueName: \"kubernetes.io/projected/2150b962-b815-4dec-ac4c-468aad4dc16c-kube-api-access-tgpdb\") pod \"swift-operator-controller-manager-68fc8c869-fgvhj\" (UID: \"2150b962-b815-4dec-ac4c-468aad4dc16c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.882534 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:56 crc kubenswrapper[4834]: E0130 21:33:56.884049 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 21:33:56 crc kubenswrapper[4834]: E0130 21:33:56.884092 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert podName:b0b534e5-be84-4fd0-a8f6-ee233988095e nodeName:}" failed. No retries permitted until 2026-01-30 21:33:57.884080174 +0000 UTC m=+1089.037226312 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert") pod "infra-operator-controller-manager-79955696d6-6rdsk" (UID: "b0b534e5-be84-4fd0-a8f6-ee233988095e") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.911597 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgpdb\" (UniqueName: \"kubernetes.io/projected/2150b962-b815-4dec-ac4c-468aad4dc16c-kube-api-access-tgpdb\") pod \"swift-operator-controller-manager-68fc8c869-fgvhj\" (UID: \"2150b962-b815-4dec-ac4c-468aad4dc16c\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.913590 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.915155 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.916271 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.925876 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-ntrgt" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.935222 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.983761 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzqx\" (UniqueName: \"kubernetes.io/projected/9c4aa5d9-5f04-43a6-92d5-8258862556d2-kube-api-access-vpzqx\") pod \"telemetry-operator-controller-manager-6749767b8f-kk9tb\" (UID: \"9c4aa5d9-5f04-43a6-92d5-8258862556d2\") " pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.988305 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-zzf9h"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.989734 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.993139 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4z2rk" Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.996228 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-zzf9h"] Jan 30 21:33:56 crc kubenswrapper[4834]: I0130 21:33:56.997291 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.052480 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.053793 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.055760 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nk8lr" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.056481 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.056660 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.063626 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.086532 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x9vr\" (UniqueName: \"kubernetes.io/projected/570acfca-9a4a-403d-a421-b339b31def95-kube-api-access-5x9vr\") pod \"watcher-operator-controller-manager-564965969-zzf9h\" (UID: \"570acfca-9a4a-403d-a421-b339b31def95\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.086591 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m25hj\" (UniqueName: \"kubernetes.io/projected/9306fcee-b55f-488a-bc53-1a809c9f20e9-kube-api-access-m25hj\") pod \"test-operator-controller-manager-56f8bfcd9f-x4j85\" (UID: \"9306fcee-b55f-488a-bc53-1a809c9f20e9\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.086615 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzqx\" (UniqueName: \"kubernetes.io/projected/9c4aa5d9-5f04-43a6-92d5-8258862556d2-kube-api-access-vpzqx\") pod \"telemetry-operator-controller-manager-6749767b8f-kk9tb\" (UID: \"9c4aa5d9-5f04-43a6-92d5-8258862556d2\") " pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.086729 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.102900 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.110934 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzqx\" (UniqueName: \"kubernetes.io/projected/9c4aa5d9-5f04-43a6-92d5-8258862556d2-kube-api-access-vpzqx\") pod \"telemetry-operator-controller-manager-6749767b8f-kk9tb\" (UID: \"9c4aa5d9-5f04-43a6-92d5-8258862556d2\") " pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.114528 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.115814 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.119208 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7cx9k" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.151064 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.153663 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.186062 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.189353 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n4c8\" (UniqueName: \"kubernetes.io/projected/a9897e6e-a451-4b52-9135-dca4af64fbfb-kube-api-access-2n4c8\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.189429 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x9vr\" (UniqueName: \"kubernetes.io/projected/570acfca-9a4a-403d-a421-b339b31def95-kube-api-access-5x9vr\") pod \"watcher-operator-controller-manager-564965969-zzf9h\" (UID: \"570acfca-9a4a-403d-a421-b339b31def95\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.189464 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.189487 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m25hj\" (UniqueName: \"kubernetes.io/projected/9306fcee-b55f-488a-bc53-1a809c9f20e9-kube-api-access-m25hj\") pod 
\"test-operator-controller-manager-56f8bfcd9f-x4j85\" (UID: \"9306fcee-b55f-488a-bc53-1a809c9f20e9\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.189531 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.189598 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.189711 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.189759 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert podName:1ba55708-11fd-4a17-9a95-88fd28711fb6 nodeName:}" failed. No retries permitted until 2026-01-30 21:33:58.189746182 +0000 UTC m=+1089.342892320 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" (UID: "1ba55708-11fd-4a17-9a95-88fd28711fb6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.204374 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.209616 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x9vr\" (UniqueName: \"kubernetes.io/projected/570acfca-9a4a-403d-a421-b339b31def95-kube-api-access-5x9vr\") pod \"watcher-operator-controller-manager-564965969-zzf9h\" (UID: \"570acfca-9a4a-403d-a421-b339b31def95\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.209613 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m25hj\" (UniqueName: \"kubernetes.io/projected/9306fcee-b55f-488a-bc53-1a809c9f20e9-kube-api-access-m25hj\") pod \"test-operator-controller-manager-56f8bfcd9f-x4j85\" (UID: \"9306fcee-b55f-488a-bc53-1a809c9f20e9\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.211820 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.281639 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.293238 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n4c8\" (UniqueName: \"kubernetes.io/projected/a9897e6e-a451-4b52-9135-dca4af64fbfb-kube-api-access-2n4c8\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.293301 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwmvk\" (UniqueName: \"kubernetes.io/projected/b998e309-c037-436c-aed4-12298af019ac-kube-api-access-dwmvk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gxzqj\" (UID: \"b998e309-c037-436c-aed4-12298af019ac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.293328 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.293368 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.293517 
4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.293517 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.293558 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:33:57.793543626 +0000 UTC m=+1088.946689764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "metrics-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.293587 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:33:57.793568196 +0000 UTC m=+1088.946714334 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "webhook-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.311119 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n4c8\" (UniqueName: \"kubernetes.io/projected/a9897e6e-a451-4b52-9135-dca4af64fbfb-kube-api-access-2n4c8\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.318996 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.333767 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.381256 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" event={"ID":"9c50c180-8d91-43d0-bb07-4ea3881a1751","Type":"ContainerStarted","Data":"159cdb5706165f7d13a6846a68e5184ab8a9ce41e6974fb66cf2a0b0a3c18984"} Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.383909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" event={"ID":"6371c2f9-d19b-4b87-b0db-ba05d48ea5fb","Type":"ContainerStarted","Data":"2aa2ba3cd29d923d854dc3c5d00633cd0a0f29345d5cd0239a2a98f0925240a3"} Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.384799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" event={"ID":"2156cb3c-172b-4268-86e6-64b1d40b87ed","Type":"ContainerStarted","Data":"32d54086c8c00fe94a0c1bb3fd67e5ebebca9cc399a5afa3f1f90f9c61f29cd8"} Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.390498 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" event={"ID":"2e538c79-bca6-46f0-a63d-fb537639f206","Type":"ContainerStarted","Data":"2af709977db18feec7df3913f210c5c8ccc42bd2c6effc4253fce8d2a8c1d191"} Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.396254 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwmvk\" (UniqueName: \"kubernetes.io/projected/b998e309-c037-436c-aed4-12298af019ac-kube-api-access-dwmvk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gxzqj\" (UID: \"b998e309-c037-436c-aed4-12298af019ac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" Jan 30 21:33:57 crc kubenswrapper[4834]: 
I0130 21:33:57.403256 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.419531 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwmvk\" (UniqueName: \"kubernetes.io/projected/b998e309-c037-436c-aed4-12298af019ac-kube-api-access-dwmvk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gxzqj\" (UID: \"b998e309-c037-436c-aed4-12298af019ac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.433476 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7"] Jan 30 21:33:57 crc kubenswrapper[4834]: W0130 21:33:57.472189 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6aef56e_f8a3_4400_b28f_5bd40a323c73.slice/crio-8dbd1847806e9d1960f6f99198e435cfde4d82e6013246385a149995c947df37 WatchSource:0}: Error finding container 8dbd1847806e9d1960f6f99198e435cfde4d82e6013246385a149995c947df37: Status 404 returned error can't find the container with id 8dbd1847806e9d1960f6f99198e435cfde4d82e6013246385a149995c947df37 Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.611775 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.618366 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-889p5"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.643650 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp"] Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.651629 4834 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt"] Jan 30 21:33:57 crc kubenswrapper[4834]: W0130 21:33:57.657562 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd68f882f_c07c_4022_a6fa_f4814f313870.slice/crio-ff6700cbc7a21170844ce909649befd7452fc3854ed551bb6f08befd32ae360e WatchSource:0}: Error finding container ff6700cbc7a21170844ce909649befd7452fc3854ed551bb6f08befd32ae360e: Status 404 returned error can't find the container with id ff6700cbc7a21170844ce909649befd7452fc3854ed551bb6f08befd32ae360e Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.674604 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.790209 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck"] Jan 30 21:33:57 crc kubenswrapper[4834]: W0130 21:33:57.800179 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec5f427f_4be8_4066_a817_c9e2e3df4e6f.slice/crio-6033251b6d0d3a61dd080296843819694ab06ef56e6e0e89fc1336998a1bec28 WatchSource:0}: Error finding container 6033251b6d0d3a61dd080296843819694ab06ef56e6e0e89fc1336998a1bec28: Status 404 returned error can't find the container with id 6033251b6d0d3a61dd080296843819694ab06ef56e6e0e89fc1336998a1bec28 Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.802463 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " 
pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.802522 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.802642 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.802698 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:33:58.802678266 +0000 UTC m=+1089.955824514 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "metrics-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.803061 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.803098 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:33:58.803086848 +0000 UTC m=+1089.956233066 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "webhook-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.906973 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.907679 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: E0130 21:33:57.907726 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert podName:b0b534e5-be84-4fd0-a8f6-ee233988095e nodeName:}" failed. No retries permitted until 2026-01-30 21:33:59.907712714 +0000 UTC m=+1091.060858842 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert") pod "infra-operator-controller-manager-79955696d6-6rdsk" (UID: "b0b534e5-be84-4fd0-a8f6-ee233988095e") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:33:57 crc kubenswrapper[4834]: I0130 21:33:57.962014 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj"] Jan 30 21:33:57 crc kubenswrapper[4834]: W0130 21:33:57.971642 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb998e309_c037_436c_aed4_12298af019ac.slice/crio-08499169a196a93216b3cc6ff4ee0c562564c17de3fd2cfbef7669d37054d9d0 WatchSource:0}: Error finding container 08499169a196a93216b3cc6ff4ee0c562564c17de3fd2cfbef7669d37054d9d0: Status 404 returned error can't find the container with id 08499169a196a93216b3cc6ff4ee0c562564c17de3fd2cfbef7669d37054d9d0 Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.100501 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-lllql"] Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.113577 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx"] Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.130403 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj"] Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.136513 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5sm22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-585dbc889-lllql_openstack-operators(e22f9414-3441-42d6-adde-8629c168c055): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.137744 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" podUID="e22f9414-3441-42d6-adde-8629c168c055" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.139889 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb"] Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.151207 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg"] Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.152158 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tgpdb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-fgvhj_openstack-operators(2150b962-b815-4dec-ac4c-468aad4dc16c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.153422 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" podUID="2150b962-b815-4dec-ac4c-468aad4dc16c" Jan 30 21:33:58 crc kubenswrapper[4834]: W0130 21:33:58.162660 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c4aa5d9_5f04_43a6_92d5_8258862556d2.slice/crio-042b2bfb17fa6cc1aa853498d900bd12d11f02b6daadcf6b700a4b083aad54cb WatchSource:0}: Error finding container 042b2bfb17fa6cc1aa853498d900bd12d11f02b6daadcf6b700a4b083aad54cb: Status 404 returned error can't find the container with id 042b2bfb17fa6cc1aa853498d900bd12d11f02b6daadcf6b700a4b083aad54cb Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.164594 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85"] Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.167228 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.103:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vpzqx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6749767b8f-kk9tb_openstack-operators(9c4aa5d9-5f04-43a6-92d5-8258862556d2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.168301 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" podUID="9c4aa5d9-5f04-43a6-92d5-8258862556d2" Jan 30 21:33:58 crc kubenswrapper[4834]: W0130 21:33:58.172753 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod570acfca_9a4a_403d_a421_b339b31def95.slice/crio-39ec48958462d4bbaea76ed576b59df34e99363bf8edf27331d551c4a8744f62 WatchSource:0}: Error finding container 39ec48958462d4bbaea76ed576b59df34e99363bf8edf27331d551c4a8744f62: Status 404 returned error can't find the container with id 39ec48958462d4bbaea76ed576b59df34e99363bf8edf27331d551c4a8744f62 Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.175553 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5x9vr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-zzf9h_openstack-operators(570acfca-9a4a-403d-a421-b339b31def95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.176658 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" podUID="570acfca-9a4a-403d-a421-b339b31def95" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.177354 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-zzf9h"] Jan 30 21:33:58 crc kubenswrapper[4834]: W0130 21:33:58.188268 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9306fcee_b55f_488a_bc53_1a809c9f20e9.slice/crio-75df4dc72164469bfeebf140a41434513aeaf3c6b67c0709b6a7331e4de3e6cc WatchSource:0}: Error finding container 75df4dc72164469bfeebf140a41434513aeaf3c6b67c0709b6a7331e4de3e6cc: Status 404 returned error can't find the container with id 
75df4dc72164469bfeebf140a41434513aeaf3c6b67c0709b6a7331e4de3e6cc Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.192293 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m25hj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-x4j85_openstack-operators(9306fcee-b55f-488a-bc53-1a809c9f20e9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.193507 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" podUID="9306fcee-b55f-488a-bc53-1a809c9f20e9" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.214738 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.215032 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 
21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.215163 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert podName:1ba55708-11fd-4a17-9a95-88fd28711fb6 nodeName:}" failed. No retries permitted until 2026-01-30 21:34:00.215134083 +0000 UTC m=+1091.368280221 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" (UID: "1ba55708-11fd-4a17-9a95-88fd28711fb6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.403417 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" event={"ID":"afece73a-c2b2-4905-819f-e8c73d968968","Type":"ContainerStarted","Data":"f254fd69f42f2e2c647e8f3cc8fc2336ca53d9e0c8fde22ec1f9a85d04c55bc4"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.405133 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" event={"ID":"b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6","Type":"ContainerStarted","Data":"35dc50df4f4c28945f65648b97c64d9b47467f4c2871f3d2cc3a963deb05890d"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.407111 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" event={"ID":"d6aef56e-f8a3-4400-b28f-5bd40a323c73","Type":"ContainerStarted","Data":"8dbd1847806e9d1960f6f99198e435cfde4d82e6013246385a149995c947df37"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.408624 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" 
event={"ID":"88accce3-ac33-420c-aa10-b7fea0b498c3","Type":"ContainerStarted","Data":"150b1598577d9c08e1fe8cdb934442e0b265f694c45d80951c7b5f60035b547f"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.410620 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" event={"ID":"570acfca-9a4a-403d-a421-b339b31def95","Type":"ContainerStarted","Data":"39ec48958462d4bbaea76ed576b59df34e99363bf8edf27331d551c4a8744f62"} Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.413067 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" podUID="570acfca-9a4a-403d-a421-b339b31def95" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.413972 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" event={"ID":"9306fcee-b55f-488a-bc53-1a809c9f20e9","Type":"ContainerStarted","Data":"75df4dc72164469bfeebf140a41434513aeaf3c6b67c0709b6a7331e4de3e6cc"} Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.415990 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" podUID="9306fcee-b55f-488a-bc53-1a809c9f20e9" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.417751 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" 
event={"ID":"eea6da10-7c27-42c7-a532-f872f8b7c86a","Type":"ContainerStarted","Data":"55b706df775170873250003f65ec12d8f26ad3253e2007c2b3cea88893942a6b"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.419227 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" event={"ID":"cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73","Type":"ContainerStarted","Data":"527f65618dd9ac3600843406188740737dd84832ed2d1cdb54bb27324d1675c6"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.422723 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" event={"ID":"ec5f427f-4be8-4066-a817-c9e2e3df4e6f","Type":"ContainerStarted","Data":"6033251b6d0d3a61dd080296843819694ab06ef56e6e0e89fc1336998a1bec28"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.429254 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" event={"ID":"b998e309-c037-436c-aed4-12298af019ac","Type":"ContainerStarted","Data":"08499169a196a93216b3cc6ff4ee0c562564c17de3fd2cfbef7669d37054d9d0"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.433633 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" event={"ID":"2150b962-b815-4dec-ac4c-468aad4dc16c","Type":"ContainerStarted","Data":"ce9dc53317eb9eb3742d0fc669ca89f5b4af5856e9a9650291a5c72d621457e3"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.435080 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" event={"ID":"1855b8a8-7a5e-4516-9bc6-156c6bb52068","Type":"ContainerStarted","Data":"42e2825f858e387d39840688ca9dd3e8d7b780e97fc0fe207ef4b80289c30a0b"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.437496 4834 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" event={"ID":"9c4aa5d9-5f04-43a6-92d5-8258862556d2","Type":"ContainerStarted","Data":"042b2bfb17fa6cc1aa853498d900bd12d11f02b6daadcf6b700a4b083aad54cb"} Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.444709 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.103:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" podUID="9c4aa5d9-5f04-43a6-92d5-8258862556d2" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.445438 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" event={"ID":"d68f882f-c07c-4022-a6fa-f4814f313870","Type":"ContainerStarted","Data":"ff6700cbc7a21170844ce909649befd7452fc3854ed551bb6f08befd32ae360e"} Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.447639 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" podUID="2150b962-b815-4dec-ac4c-468aad4dc16c" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.448735 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" event={"ID":"4d4e0469-0167-410e-9bfd-26f81b9900cf","Type":"ContainerStarted","Data":"03d365cab148603cc02e35a5064c1b7851c96e47a0ca42f076580cf310e7277a"} Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.453335 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" event={"ID":"e22f9414-3441-42d6-adde-8629c168c055","Type":"ContainerStarted","Data":"a6c1fb9dbbff01b0a1a935e78b56f010256248ddf3ae898a2a1aaeb61c7b3878"} Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.456196 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" podUID="e22f9414-3441-42d6-adde-8629c168c055" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.826317 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:58 crc kubenswrapper[4834]: I0130 21:33:58.826402 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.826545 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.826601 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:33:58 crc 
kubenswrapper[4834]: E0130 21:33:58.826655 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:34:00.826627256 +0000 UTC m=+1091.979773394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "webhook-server-cert" not found Jan 30 21:33:58 crc kubenswrapper[4834]: E0130 21:33:58.826680 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:34:00.826663817 +0000 UTC m=+1091.979809955 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "metrics-server-cert" not found Jan 30 21:33:59 crc kubenswrapper[4834]: E0130 21:33:59.490789 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" podUID="570acfca-9a4a-403d-a421-b339b31def95" Jan 30 21:33:59 crc kubenswrapper[4834]: E0130 21:33:59.492493 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" podUID="2150b962-b815-4dec-ac4c-468aad4dc16c" Jan 30 21:33:59 crc kubenswrapper[4834]: E0130 21:33:59.492655 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.103:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" podUID="9c4aa5d9-5f04-43a6-92d5-8258862556d2" Jan 30 21:33:59 crc kubenswrapper[4834]: E0130 21:33:59.492742 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" podUID="e22f9414-3441-42d6-adde-8629c168c055" Jan 30 21:33:59 crc kubenswrapper[4834]: E0130 21:33:59.493719 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" podUID="9306fcee-b55f-488a-bc53-1a809c9f20e9" Jan 30 21:33:59 crc kubenswrapper[4834]: I0130 21:33:59.946816 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:33:59 crc kubenswrapper[4834]: E0130 21:33:59.946994 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 21:33:59 crc kubenswrapper[4834]: E0130 21:33:59.947049 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert podName:b0b534e5-be84-4fd0-a8f6-ee233988095e nodeName:}" failed. No retries permitted until 2026-01-30 21:34:03.947029534 +0000 UTC m=+1095.100175672 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert") pod "infra-operator-controller-manager-79955696d6-6rdsk" (UID: "b0b534e5-be84-4fd0-a8f6-ee233988095e") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:34:00 crc kubenswrapper[4834]: I0130 21:34:00.249814 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:34:00 crc kubenswrapper[4834]: E0130 21:34:00.249981 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:34:00 crc kubenswrapper[4834]: E0130 21:34:00.250043 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert podName:1ba55708-11fd-4a17-9a95-88fd28711fb6 nodeName:}" failed. No retries permitted until 2026-01-30 21:34:04.250024539 +0000 UTC m=+1095.403170677 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" (UID: "1ba55708-11fd-4a17-9a95-88fd28711fb6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:34:00 crc kubenswrapper[4834]: I0130 21:34:00.860202 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:00 crc kubenswrapper[4834]: I0130 21:34:00.860653 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:00 crc kubenswrapper[4834]: E0130 21:34:00.860418 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:34:00 crc kubenswrapper[4834]: E0130 21:34:00.860741 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:34:04.860723669 +0000 UTC m=+1096.013869807 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "webhook-server-cert" not found Jan 30 21:34:00 crc kubenswrapper[4834]: E0130 21:34:00.860836 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:34:00 crc kubenswrapper[4834]: E0130 21:34:00.860908 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:34:04.860887633 +0000 UTC m=+1096.014033881 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "metrics-server-cert" not found Jan 30 21:34:04 crc kubenswrapper[4834]: I0130 21:34:04.016831 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:34:04 crc kubenswrapper[4834]: E0130 21:34:04.017108 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 30 21:34:04 crc kubenswrapper[4834]: E0130 21:34:04.017226 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert 
podName:b0b534e5-be84-4fd0-a8f6-ee233988095e nodeName:}" failed. No retries permitted until 2026-01-30 21:34:12.017199335 +0000 UTC m=+1103.170345513 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert") pod "infra-operator-controller-manager-79955696d6-6rdsk" (UID: "b0b534e5-be84-4fd0-a8f6-ee233988095e") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:34:04 crc kubenswrapper[4834]: I0130 21:34:04.161271 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:34:04 crc kubenswrapper[4834]: I0130 21:34:04.161346 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:34:04 crc kubenswrapper[4834]: I0130 21:34:04.321507 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:34:04 crc kubenswrapper[4834]: E0130 21:34:04.321904 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:34:04 crc kubenswrapper[4834]: E0130 21:34:04.322108 4834 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert podName:1ba55708-11fd-4a17-9a95-88fd28711fb6 nodeName:}" failed. No retries permitted until 2026-01-30 21:34:12.32202658 +0000 UTC m=+1103.475172758 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" (UID: "1ba55708-11fd-4a17-9a95-88fd28711fb6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:34:04 crc kubenswrapper[4834]: I0130 21:34:04.931180 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:04 crc kubenswrapper[4834]: I0130 21:34:04.931345 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:04 crc kubenswrapper[4834]: E0130 21:34:04.931355 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:34:04 crc kubenswrapper[4834]: E0130 21:34:04.931476 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. 
No retries permitted until 2026-01-30 21:34:12.931456746 +0000 UTC m=+1104.084602874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "metrics-server-cert" not found Jan 30 21:34:04 crc kubenswrapper[4834]: E0130 21:34:04.931594 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:34:04 crc kubenswrapper[4834]: E0130 21:34:04.931691 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:34:12.931668102 +0000 UTC m=+1104.084814330 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "webhook-server-cert" not found Jan 30 21:34:09 crc kubenswrapper[4834]: E0130 21:34:09.007518 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898" Jan 30 21:34:09 crc kubenswrapper[4834]: E0130 21:34:09.007961 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sd7q5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d874c8fc-4crlf_openstack-operators(6371c2f9-d19b-4b87-b0db-ba05d48ea5fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:09 crc kubenswrapper[4834]: E0130 21:34:09.009145 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" podUID="6371c2f9-d19b-4b87-b0db-ba05d48ea5fb" Jan 30 21:34:09 crc kubenswrapper[4834]: E0130 21:34:09.562043 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" podUID="6371c2f9-d19b-4b87-b0db-ba05d48ea5fb" Jan 30 21:34:09 crc kubenswrapper[4834]: E0130 21:34:09.764641 4834 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566" Jan 30 21:34:09 crc kubenswrapper[4834]: E0130 21:34:09.764803 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kp5sq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7dd968899f-ccd87_openstack-operators(d6aef56e-f8a3-4400-b28f-5bd40a323c73): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:09 crc kubenswrapper[4834]: E0130 21:34:09.766033 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" podUID="d6aef56e-f8a3-4400-b28f-5bd40a323c73" Jan 30 21:34:10 crc kubenswrapper[4834]: E0130 21:34:10.573113 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" podUID="d6aef56e-f8a3-4400-b28f-5bd40a323c73" Jan 30 21:34:10 crc kubenswrapper[4834]: E0130 21:34:10.760031 4834 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf" Jan 30 21:34:10 crc kubenswrapper[4834]: E0130 21:34:10.760287 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q8zjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf948998-vc2wt_openstack-operators(d68f882f-c07c-4022-a6fa-f4814f313870): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:10 crc kubenswrapper[4834]: E0130 21:34:10.761552 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" podUID="d68f882f-c07c-4022-a6fa-f4814f313870" Jan 30 21:34:11 crc kubenswrapper[4834]: E0130 21:34:11.669319 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" podUID="d68f882f-c07c-4022-a6fa-f4814f313870" Jan 30 21:34:11 crc kubenswrapper[4834]: E0130 21:34:11.740637 4834 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382" Jan 30 21:34:11 crc kubenswrapper[4834]: E0130 21:34:11.741298 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6qvpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-6d9697b7f4-hjdtw_openstack-operators(2e538c79-bca6-46f0-a63d-fb537639f206): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:11 crc kubenswrapper[4834]: E0130 21:34:11.744674 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" podUID="2e538c79-bca6-46f0-a63d-fb537639f206" Jan 30 21:34:12 crc kubenswrapper[4834]: I0130 21:34:12.053802 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.054000 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.054431 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert podName:b0b534e5-be84-4fd0-a8f6-ee233988095e nodeName:}" failed. No retries permitted until 2026-01-30 21:34:28.05441002 +0000 UTC m=+1119.207556168 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert") pod "infra-operator-controller-manager-79955696d6-6rdsk" (UID: "b0b534e5-be84-4fd0-a8f6-ee233988095e") : secret "infra-operator-webhook-server-cert" not found Jan 30 21:34:12 crc kubenswrapper[4834]: I0130 21:34:12.359856 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.360031 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.360095 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert podName:1ba55708-11fd-4a17-9a95-88fd28711fb6 nodeName:}" failed. No retries permitted until 2026-01-30 21:34:28.360081141 +0000 UTC m=+1119.513227279 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" (UID: "1ba55708-11fd-4a17-9a95-88fd28711fb6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.592590 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:d9f6f8dc6a6dd9b0d7c96e4c89b3056291fd61f11126a1304256a4d6cacd0382\\\"\"" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" podUID="2e538c79-bca6-46f0-a63d-fb537639f206" Jan 30 21:34:12 crc kubenswrapper[4834]: I0130 21:34:12.982649 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:12 crc kubenswrapper[4834]: I0130 21:34:12.982741 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.982899 4834 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.982956 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:34:28.982938066 +0000 UTC m=+1120.136084214 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "metrics-server-cert" not found Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.983367 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 30 21:34:12 crc kubenswrapper[4834]: E0130 21:34:12.983422 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs podName:a9897e6e-a451-4b52-9135-dca4af64fbfb nodeName:}" failed. No retries permitted until 2026-01-30 21:34:28.9834109 +0000 UTC m=+1120.136557048 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs") pod "openstack-operator-controller-manager-7d48698d88-q72fc" (UID: "a9897e6e-a451-4b52-9135-dca4af64fbfb") : secret "webhook-server-cert" not found Jan 30 21:34:15 crc kubenswrapper[4834]: E0130 21:34:15.846800 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488" Jan 30 21:34:15 crc kubenswrapper[4834]: E0130 21:34:15.847203 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pw2hh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-45npx_openstack-operators(4d4e0469-0167-410e-9bfd-26f81b9900cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:15 crc kubenswrapper[4834]: E0130 21:34:15.848616 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" podUID="4d4e0469-0167-410e-9bfd-26f81b9900cf" Jan 30 21:34:16 crc kubenswrapper[4834]: E0130 21:34:16.630228 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" podUID="4d4e0469-0167-410e-9bfd-26f81b9900cf" Jan 30 21:34:16 crc kubenswrapper[4834]: E0130 21:34:16.917238 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e" Jan 30 21:34:16 crc kubenswrapper[4834]: E0130 21:34:16.917455 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knw94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-889p5_openstack-operators(eea6da10-7c27-42c7-a532-f872f8b7c86a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:16 crc kubenswrapper[4834]: E0130 21:34:16.918622 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" podUID="eea6da10-7c27-42c7-a532-f872f8b7c86a" Jan 30 21:34:17 crc kubenswrapper[4834]: E0130 21:34:17.641675 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" podUID="eea6da10-7c27-42c7-a532-f872f8b7c86a" Jan 30 21:34:17 crc kubenswrapper[4834]: E0130 21:34:17.648189 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17" Jan 30 21:34:17 crc kubenswrapper[4834]: E0130 21:34:17.648344 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wncvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-n4kck_openstack-operators(ec5f427f-4be8-4066-a817-c9e2e3df4e6f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:17 crc kubenswrapper[4834]: E0130 21:34:17.649559 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" podUID="ec5f427f-4be8-4066-a817-c9e2e3df4e6f" Jan 30 21:34:18 crc kubenswrapper[4834]: E0130 21:34:18.168214 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 30 21:34:18 crc kubenswrapper[4834]: E0130 21:34:18.168363 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dwmvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gxzqj_openstack-operators(b998e309-c037-436c-aed4-12298af019ac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:18 crc kubenswrapper[4834]: E0130 21:34:18.169522 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" podUID="b998e309-c037-436c-aed4-12298af019ac" Jan 30 21:34:18 crc kubenswrapper[4834]: E0130 21:34:18.643552 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" podUID="ec5f427f-4be8-4066-a817-c9e2e3df4e6f" Jan 30 21:34:18 crc 
kubenswrapper[4834]: E0130 21:34:18.644706 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" podUID="b998e309-c037-436c-aed4-12298af019ac" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.683268 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" event={"ID":"9c50c180-8d91-43d0-bb07-4ea3881a1751","Type":"ContainerStarted","Data":"e3fb35c3aa2eee22130b5d497d09b3ffeb4351910eef63e4d127a1a20390cec9"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.683965 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.684612 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" event={"ID":"b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6","Type":"ContainerStarted","Data":"05e4ca23bfed70bef54ebf4a1867b150cdcd2679a82e2320b3db26117ef448b6"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.684945 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.686571 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" event={"ID":"6371c2f9-d19b-4b87-b0db-ba05d48ea5fb","Type":"ContainerStarted","Data":"a24422e2c72e7c51763b2ec68c148a432b4f77fc8b1689092ffb5b7d402e0dc7"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.686818 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.688013 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" event={"ID":"2150b962-b815-4dec-ac4c-468aad4dc16c","Type":"ContainerStarted","Data":"124fc3b4f55c1524fae8472d69f7abf8dbe12e01f06597bbfc90037367e63975"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.688261 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.689517 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" event={"ID":"9c4aa5d9-5f04-43a6-92d5-8258862556d2","Type":"ContainerStarted","Data":"d0d6c98e546386a110e44f9e99a575840380a3dfbd1f7a491a739a9bbb839039"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.689624 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.690987 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" event={"ID":"2156cb3c-172b-4268-86e6-64b1d40b87ed","Type":"ContainerStarted","Data":"a277c80a4892562c27a2a5980366084a6428b9ba4f205b9564564cfdb959e587"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.691291 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.692700 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" event={"ID":"1855b8a8-7a5e-4516-9bc6-156c6bb52068","Type":"ContainerStarted","Data":"a35f6aae50ebfb35a1891755dbdb641761b38730e2cf1e278f7beb8b6868129c"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.693517 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.694764 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" event={"ID":"afece73a-c2b2-4905-819f-e8c73d968968","Type":"ContainerStarted","Data":"66262bb848a512574beba48a39e2d763b166d37d4c2bcb501f2bd227f12a7b3a"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.695234 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.696872 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" event={"ID":"cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73","Type":"ContainerStarted","Data":"1ad4f5c4787fc56e860c0fbfabbad7cd7059e1a56dc665a9903889cdd1022f77"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.697007 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.698437 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" event={"ID":"88accce3-ac33-420c-aa10-b7fea0b498c3","Type":"ContainerStarted","Data":"945137495f9205c596c03aafe6623631f368dcb222f361c689f08b4ab7affd71"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.698492 4834 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.699880 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" event={"ID":"570acfca-9a4a-403d-a421-b339b31def95","Type":"ContainerStarted","Data":"941cf51fad5d51339a1191ad781ac1c2ee9aa668011536e3322b3b690dfb9745"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.700233 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.701301 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" event={"ID":"9306fcee-b55f-488a-bc53-1a809c9f20e9","Type":"ContainerStarted","Data":"a9bb0a8e401e149be7c8e16b0e270ef2646ad519a38ab0ffe985c2f1cf919671"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.701669 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.703052 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" event={"ID":"e22f9414-3441-42d6-adde-8629c168c055","Type":"ContainerStarted","Data":"5bb327f164a8560efd12545d7154b5fa1c9ad3c2f4eb06ba6aca04497c1ad907"} Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.703507 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.786302 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" 
podStartSLOduration=3.504327007 podStartE2EDuration="27.786285951s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:58.192035065 +0000 UTC m=+1089.345181203" lastFinishedPulling="2026-01-30 21:34:22.473993999 +0000 UTC m=+1113.627140147" observedRunningTime="2026-01-30 21:34:23.778694067 +0000 UTC m=+1114.931840205" watchObservedRunningTime="2026-01-30 21:34:23.786285951 +0000 UTC m=+1114.939432089" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.788009 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" podStartSLOduration=7.292312088 podStartE2EDuration="28.788002489s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.20427801 +0000 UTC m=+1088.357424148" lastFinishedPulling="2026-01-30 21:34:18.699968401 +0000 UTC m=+1109.853114549" observedRunningTime="2026-01-30 21:34:23.727273722 +0000 UTC m=+1114.880419860" watchObservedRunningTime="2026-01-30 21:34:23.788002489 +0000 UTC m=+1114.941148627" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.806250 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" podStartSLOduration=2.306072494 podStartE2EDuration="28.806229891s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:56.863852356 +0000 UTC m=+1088.016998494" lastFinishedPulling="2026-01-30 21:34:23.364009753 +0000 UTC m=+1114.517155891" observedRunningTime="2026-01-30 21:34:23.804778891 +0000 UTC m=+1114.957925029" watchObservedRunningTime="2026-01-30 21:34:23.806229891 +0000 UTC m=+1114.959376029" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.859549 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" 
podStartSLOduration=3.6265724820000003 podStartE2EDuration="27.859533959s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:58.175413438 +0000 UTC m=+1089.328559576" lastFinishedPulling="2026-01-30 21:34:22.408374885 +0000 UTC m=+1113.561521053" observedRunningTime="2026-01-30 21:34:23.838164549 +0000 UTC m=+1114.991310687" watchObservedRunningTime="2026-01-30 21:34:23.859533959 +0000 UTC m=+1115.012680097" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.906080 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" podStartSLOduration=8.678537207 podStartE2EDuration="27.906064907s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.625699789 +0000 UTC m=+1088.778845927" lastFinishedPulling="2026-01-30 21:34:16.853227489 +0000 UTC m=+1108.006373627" observedRunningTime="2026-01-30 21:34:23.894204414 +0000 UTC m=+1115.047350552" watchObservedRunningTime="2026-01-30 21:34:23.906064907 +0000 UTC m=+1115.059211045" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.932679 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" podStartSLOduration=7.608686594 podStartE2EDuration="28.932663835s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.375063034 +0000 UTC m=+1088.528209172" lastFinishedPulling="2026-01-30 21:34:18.699040275 +0000 UTC m=+1109.852186413" observedRunningTime="2026-01-30 21:34:23.928164078 +0000 UTC m=+1115.081310216" watchObservedRunningTime="2026-01-30 21:34:23.932663835 +0000 UTC m=+1115.085809973" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.958702 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" 
podStartSLOduration=3.686154979 podStartE2EDuration="27.958685156s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:58.135554279 +0000 UTC m=+1089.288700417" lastFinishedPulling="2026-01-30 21:34:22.408084446 +0000 UTC m=+1113.561230594" observedRunningTime="2026-01-30 21:34:23.956158335 +0000 UTC m=+1115.109304473" watchObservedRunningTime="2026-01-30 21:34:23.958685156 +0000 UTC m=+1115.111831294" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.978597 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" podStartSLOduration=3.724804985 podStartE2EDuration="27.978583075s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:58.151919299 +0000 UTC m=+1089.305065437" lastFinishedPulling="2026-01-30 21:34:22.405697349 +0000 UTC m=+1113.558843527" observedRunningTime="2026-01-30 21:34:23.974550532 +0000 UTC m=+1115.127696670" watchObservedRunningTime="2026-01-30 21:34:23.978583075 +0000 UTC m=+1115.131729213" Jan 30 21:34:23 crc kubenswrapper[4834]: I0130 21:34:23.989152 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" podStartSLOduration=7.739978861 podStartE2EDuration="28.989138972s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.449438311 +0000 UTC m=+1088.602584449" lastFinishedPulling="2026-01-30 21:34:18.698598422 +0000 UTC m=+1109.851744560" observedRunningTime="2026-01-30 21:34:23.986965311 +0000 UTC m=+1115.140111449" watchObservedRunningTime="2026-01-30 21:34:23.989138972 +0000 UTC m=+1115.142285110" Jan 30 21:34:24 crc kubenswrapper[4834]: I0130 21:34:24.049351 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" 
podStartSLOduration=8.007929864 podStartE2EDuration="29.049325033s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.657587334 +0000 UTC m=+1088.810733472" lastFinishedPulling="2026-01-30 21:34:18.698982493 +0000 UTC m=+1109.852128641" observedRunningTime="2026-01-30 21:34:24.028853768 +0000 UTC m=+1115.181999906" watchObservedRunningTime="2026-01-30 21:34:24.049325033 +0000 UTC m=+1115.202471171" Jan 30 21:34:24 crc kubenswrapper[4834]: I0130 21:34:24.053682 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" podStartSLOduration=7.487649359 podStartE2EDuration="28.053673426s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:58.132837923 +0000 UTC m=+1089.285984051" lastFinishedPulling="2026-01-30 21:34:18.69886198 +0000 UTC m=+1109.852008118" observedRunningTime="2026-01-30 21:34:24.05277502 +0000 UTC m=+1115.205921158" watchObservedRunningTime="2026-01-30 21:34:24.053673426 +0000 UTC m=+1115.206819564" Jan 30 21:34:24 crc kubenswrapper[4834]: I0130 21:34:24.074948 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" podStartSLOduration=3.362936983 podStartE2EDuration="28.074934033s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:58.167074634 +0000 UTC m=+1089.320220772" lastFinishedPulling="2026-01-30 21:34:22.879071684 +0000 UTC m=+1114.032217822" observedRunningTime="2026-01-30 21:34:24.071309641 +0000 UTC m=+1115.224455779" watchObservedRunningTime="2026-01-30 21:34:24.074934033 +0000 UTC m=+1115.228080171" Jan 30 21:34:24 crc kubenswrapper[4834]: I0130 21:34:24.090512 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" 
podStartSLOduration=7.054841992 podStartE2EDuration="29.090494741s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.283471013 +0000 UTC m=+1088.436617151" lastFinishedPulling="2026-01-30 21:34:19.319123762 +0000 UTC m=+1110.472269900" observedRunningTime="2026-01-30 21:34:24.089668927 +0000 UTC m=+1115.242815075" watchObservedRunningTime="2026-01-30 21:34:24.090494741 +0000 UTC m=+1115.243640879" Jan 30 21:34:25 crc kubenswrapper[4834]: I0130 21:34:25.717432 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" event={"ID":"d6aef56e-f8a3-4400-b28f-5bd40a323c73","Type":"ContainerStarted","Data":"c77c93f46c8e3ecac3da3ecfde082b3936214c91a1788d1b9a1fcebd283a4af2"} Jan 30 21:34:25 crc kubenswrapper[4834]: I0130 21:34:25.717995 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" Jan 30 21:34:25 crc kubenswrapper[4834]: I0130 21:34:25.719826 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" event={"ID":"d68f882f-c07c-4022-a6fa-f4814f313870","Type":"ContainerStarted","Data":"b2a0ac1f0492f3654804bdeb06cc4bb71d156375d49b67b2d38b6a86e9d8021a"} Jan 30 21:34:25 crc kubenswrapper[4834]: I0130 21:34:25.720017 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" Jan 30 21:34:25 crc kubenswrapper[4834]: I0130 21:34:25.721290 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" event={"ID":"2e538c79-bca6-46f0-a63d-fb537639f206","Type":"ContainerStarted","Data":"52e343da573f7c36acd4417aabcfb2bc791383f0b59e33371c06f512d87ebfad"} Jan 30 21:34:25 crc kubenswrapper[4834]: I0130 21:34:25.744606 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" podStartSLOduration=2.8573020529999997 podStartE2EDuration="30.744580418s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.480845313 +0000 UTC m=+1088.633991451" lastFinishedPulling="2026-01-30 21:34:25.368123688 +0000 UTC m=+1116.521269816" observedRunningTime="2026-01-30 21:34:25.732380205 +0000 UTC m=+1116.885526373" watchObservedRunningTime="2026-01-30 21:34:25.744580418 +0000 UTC m=+1116.897726586" Jan 30 21:34:25 crc kubenswrapper[4834]: I0130 21:34:25.764074 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" podStartSLOduration=3.580582825 podStartE2EDuration="30.764056275s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.333715343 +0000 UTC m=+1088.486861481" lastFinishedPulling="2026-01-30 21:34:24.517188783 +0000 UTC m=+1115.670334931" observedRunningTime="2026-01-30 21:34:25.757345906 +0000 UTC m=+1116.910492044" watchObservedRunningTime="2026-01-30 21:34:25.764056275 +0000 UTC m=+1116.917202413" Jan 30 21:34:25 crc kubenswrapper[4834]: I0130 21:34:25.786108 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" podStartSLOduration=3.960882052 podStartE2EDuration="30.786091064s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.660108195 +0000 UTC m=+1088.813254333" lastFinishedPulling="2026-01-30 21:34:24.485317157 +0000 UTC m=+1115.638463345" observedRunningTime="2026-01-30 21:34:25.782884344 +0000 UTC m=+1116.936030482" watchObservedRunningTime="2026-01-30 21:34:25.786091064 +0000 UTC m=+1116.939237202" Jan 30 21:34:26 crc kubenswrapper[4834]: I0130 21:34:26.203964 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" Jan 30 21:34:27 crc kubenswrapper[4834]: I0130 21:34:27.067730 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-fgvhj" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.128805 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.137701 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0b534e5-be84-4fd0-a8f6-ee233988095e-cert\") pod \"infra-operator-controller-manager-79955696d6-6rdsk\" (UID: \"b0b534e5-be84-4fd0-a8f6-ee233988095e\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.383749 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.433718 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.439067 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ba55708-11fd-4a17-9a95-88fd28711fb6-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x\" (UID: \"1ba55708-11fd-4a17-9a95-88fd28711fb6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.547956 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.747047 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" event={"ID":"4d4e0469-0167-410e-9bfd-26f81b9900cf","Type":"ContainerStarted","Data":"2de2cc0041bf492cd3bdaf14ff3d9d3823c8c88260bf8c1f87e1bf28f9a51d8f"} Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.747621 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.767206 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" podStartSLOduration=2.805878719 podStartE2EDuration="32.767189968s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:58.135356394 +0000 UTC m=+1089.288502522" lastFinishedPulling="2026-01-30 21:34:28.096667633 +0000 UTC m=+1119.249813771" observedRunningTime="2026-01-30 21:34:28.761086436 +0000 UTC m=+1119.914232574" watchObservedRunningTime="2026-01-30 21:34:28.767189968 +0000 UTC m=+1119.920336106" Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.863975 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk"] Jan 30 21:34:28 crc kubenswrapper[4834]: W0130 21:34:28.865330 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b534e5_be84_4fd0_a8f6_ee233988095e.slice/crio-9fe2c5c0ad4debd5e694dce38ae8c3909af25f55849a931d0947a6036fa04296 WatchSource:0}: Error finding container 9fe2c5c0ad4debd5e694dce38ae8c3909af25f55849a931d0947a6036fa04296: Status 404 returned error can't find the 
container with id 9fe2c5c0ad4debd5e694dce38ae8c3909af25f55849a931d0947a6036fa04296 Jan 30 21:34:28 crc kubenswrapper[4834]: W0130 21:34:28.990984 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba55708_11fd_4a17_9a95_88fd28711fb6.slice/crio-a8b8285907d30d4a76b5608764597c29933d2ef1acb288c4d7d025cb38629d2f WatchSource:0}: Error finding container a8b8285907d30d4a76b5608764597c29933d2ef1acb288c4d7d025cb38629d2f: Status 404 returned error can't find the container with id a8b8285907d30d4a76b5608764597c29933d2ef1acb288c4d7d025cb38629d2f Jan 30 21:34:28 crc kubenswrapper[4834]: I0130 21:34:28.994031 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x"] Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 21:34:29.045357 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 21:34:29.045484 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 21:34:29.052151 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-metrics-certs\") pod 
\"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 21:34:29.053495 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a9897e6e-a451-4b52-9135-dca4af64fbfb-webhook-certs\") pod \"openstack-operator-controller-manager-7d48698d88-q72fc\" (UID: \"a9897e6e-a451-4b52-9135-dca4af64fbfb\") " pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 21:34:29.112105 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 21:34:29.667698 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc"] Jan 30 21:34:29 crc kubenswrapper[4834]: W0130 21:34:29.686578 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9897e6e_a451_4b52_9135_dca4af64fbfb.slice/crio-5273d5dc5cc5f76bad02cd830cac4ff824838c5e13a30f27f05c907c3d062f4f WatchSource:0}: Error finding container 5273d5dc5cc5f76bad02cd830cac4ff824838c5e13a30f27f05c907c3d062f4f: Status 404 returned error can't find the container with id 5273d5dc5cc5f76bad02cd830cac4ff824838c5e13a30f27f05c907c3d062f4f Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 21:34:29.757952 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" event={"ID":"a9897e6e-a451-4b52-9135-dca4af64fbfb","Type":"ContainerStarted","Data":"5273d5dc5cc5f76bad02cd830cac4ff824838c5e13a30f27f05c907c3d062f4f"} Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 
21:34:29.759773 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" event={"ID":"b0b534e5-be84-4fd0-a8f6-ee233988095e","Type":"ContainerStarted","Data":"9fe2c5c0ad4debd5e694dce38ae8c3909af25f55849a931d0947a6036fa04296"} Jan 30 21:34:29 crc kubenswrapper[4834]: I0130 21:34:29.761446 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" event={"ID":"1ba55708-11fd-4a17-9a95-88fd28711fb6","Type":"ContainerStarted","Data":"a8b8285907d30d4a76b5608764597c29933d2ef1acb288c4d7d025cb38629d2f"} Jan 30 21:34:30 crc kubenswrapper[4834]: I0130 21:34:30.771675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" event={"ID":"a9897e6e-a451-4b52-9135-dca4af64fbfb","Type":"ContainerStarted","Data":"b288c8421ea1cfb7295cd48892019f5d329bf21885cb3cd822902870599f051e"} Jan 30 21:34:30 crc kubenswrapper[4834]: I0130 21:34:30.773013 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:30 crc kubenswrapper[4834]: I0130 21:34:30.816057 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" podStartSLOduration=34.81603189 podStartE2EDuration="34.81603189s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:34:30.799799753 +0000 UTC m=+1121.952945891" watchObservedRunningTime="2026-01-30 21:34:30.81603189 +0000 UTC m=+1121.969178048" Jan 30 21:34:34 crc kubenswrapper[4834]: I0130 21:34:34.161164 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:34:34 crc kubenswrapper[4834]: I0130 21:34:34.162022 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.148093 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-95x2s" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.176569 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-4crlf" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.193596 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-x8rnk" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.220514 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-hjdtw" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.372017 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-l2pv5" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.385298 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-pkzb7" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.461285 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-ccd87" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.657314 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-xqpmp" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.667879 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-xrvft" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.720976 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-45npx" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.813616 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-vc2wt" Jan 30 21:34:36 crc kubenswrapper[4834]: I0130 21:34:36.916748 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-lllql" Jan 30 21:34:37 crc kubenswrapper[4834]: I0130 21:34:37.000250 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-phxxg" Jan 30 21:34:37 crc kubenswrapper[4834]: I0130 21:34:37.157914 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6749767b8f-kk9tb" Jan 30 21:34:37 crc kubenswrapper[4834]: I0130 21:34:37.284638 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-x4j85" Jan 30 21:34:37 crc kubenswrapper[4834]: I0130 21:34:37.337663 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-564965969-zzf9h" Jan 30 21:34:39 crc kubenswrapper[4834]: I0130 21:34:39.123162 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7d48698d88-q72fc" Jan 30 21:34:49 crc kubenswrapper[4834]: E0130 21:34:49.518422 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:89f6fd332fabefd2fff5619432986b37c1c6d197dd1c510f21dfe4609939b8a6" Jan 30 21:34:49 crc kubenswrapper[4834]: E0130 21:34:49.519973 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:89f6fd332fabefd2fff5619432986b37c1c6d197dd1c510f21dfe4609939b8a6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listene
r:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:cu
rrent-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Valu
e:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:qua
y.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9
/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-ce
ntos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/
podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qf5gg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x_openstack-operators(1ba55708-11fd-4a17-9a95-88fd28711fb6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:34:49 crc kubenswrapper[4834]: E0130 21:34:49.521287 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" podUID="1ba55708-11fd-4a17-9a95-88fd28711fb6" Jan 30 21:34:50 crc kubenswrapper[4834]: E0130 21:34:50.668030 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:89f6fd332fabefd2fff5619432986b37c1c6d197dd1c510f21dfe4609939b8a6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" podUID="1ba55708-11fd-4a17-9a95-88fd28711fb6" Jan 30 21:34:50 crc kubenswrapper[4834]: I0130 21:34:50.988741 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" event={"ID":"b0b534e5-be84-4fd0-a8f6-ee233988095e","Type":"ContainerStarted","Data":"b9021475994bd721c68e562bbf717a7315a79f58851b082c80bd05d15e5095c6"} Jan 30 21:34:51 crc kubenswrapper[4834]: I0130 21:34:51.995974 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:34:52 crc kubenswrapper[4834]: I0130 21:34:52.019516 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" podStartSLOduration=35.200518242 podStartE2EDuration="57.019497596s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:34:28.867325392 +0000 UTC m=+1120.020471550" lastFinishedPulling="2026-01-30 21:34:50.686304756 +0000 UTC m=+1141.839450904" observedRunningTime="2026-01-30 21:34:52.014480525 +0000 UTC m=+1143.167626703" watchObservedRunningTime="2026-01-30 21:34:52.019497596 +0000 UTC m=+1143.172643744" Jan 30 21:34:53 crc 
kubenswrapper[4834]: I0130 21:34:53.017612 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" event={"ID":"eea6da10-7c27-42c7-a532-f872f8b7c86a","Type":"ContainerStarted","Data":"e064e99dee0ed23728251c33ae049169b35690d1bae469a52368039d4ab920b3"} Jan 30 21:34:53 crc kubenswrapper[4834]: I0130 21:34:53.018295 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" Jan 30 21:34:53 crc kubenswrapper[4834]: I0130 21:34:53.020995 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" event={"ID":"ec5f427f-4be8-4066-a817-c9e2e3df4e6f","Type":"ContainerStarted","Data":"0c184818f2d2e9aed8a0de23ba0e852c2b26d62243c4d1071c7bc31dd737503a"} Jan 30 21:34:53 crc kubenswrapper[4834]: I0130 21:34:53.021225 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" Jan 30 21:34:53 crc kubenswrapper[4834]: I0130 21:34:53.023449 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" event={"ID":"b998e309-c037-436c-aed4-12298af019ac","Type":"ContainerStarted","Data":"223e6de06a9de9df8873ebd9eac48fe4405c73f223954b4bc391171e7cd7e459"} Jan 30 21:34:53 crc kubenswrapper[4834]: I0130 21:34:53.047373 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" podStartSLOduration=2.796923397 podStartE2EDuration="57.047352024s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.625306098 +0000 UTC m=+1088.778452226" lastFinishedPulling="2026-01-30 21:34:51.875734685 +0000 UTC m=+1143.028880853" observedRunningTime="2026-01-30 21:34:53.03619291 +0000 UTC 
m=+1144.189339058" watchObservedRunningTime="2026-01-30 21:34:53.047352024 +0000 UTC m=+1144.200498172" Jan 30 21:34:53 crc kubenswrapper[4834]: I0130 21:34:53.055239 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" podStartSLOduration=3.982880746 podStartE2EDuration="58.055213735s" podCreationTimestamp="2026-01-30 21:33:55 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.802899072 +0000 UTC m=+1088.956045210" lastFinishedPulling="2026-01-30 21:34:51.875232031 +0000 UTC m=+1143.028378199" observedRunningTime="2026-01-30 21:34:53.052829998 +0000 UTC m=+1144.205976146" watchObservedRunningTime="2026-01-30 21:34:53.055213735 +0000 UTC m=+1144.208359883" Jan 30 21:34:53 crc kubenswrapper[4834]: I0130 21:34:53.077030 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gxzqj" podStartSLOduration=3.174820244 podStartE2EDuration="57.077010177s" podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:33:57.974747666 +0000 UTC m=+1089.127893804" lastFinishedPulling="2026-01-30 21:34:51.876937559 +0000 UTC m=+1143.030083737" observedRunningTime="2026-01-30 21:34:53.073212411 +0000 UTC m=+1144.226358579" watchObservedRunningTime="2026-01-30 21:34:53.077010177 +0000 UTC m=+1144.230156325" Jan 30 21:34:58 crc kubenswrapper[4834]: I0130 21:34:58.394042 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-6rdsk" Jan 30 21:35:04 crc kubenswrapper[4834]: I0130 21:35:04.161579 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 
21:35:04 crc kubenswrapper[4834]: I0130 21:35:04.162286 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:35:04 crc kubenswrapper[4834]: I0130 21:35:04.162352 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:35:04 crc kubenswrapper[4834]: I0130 21:35:04.163133 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75ffc2f37f0663828c033fce2d59c1e7b940cbd240e347042d2341fa7fcc4ac6"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:35:04 crc kubenswrapper[4834]: I0130 21:35:04.163218 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://75ffc2f37f0663828c033fce2d59c1e7b940cbd240e347042d2341fa7fcc4ac6" gracePeriod=600 Jan 30 21:35:05 crc kubenswrapper[4834]: I0130 21:35:05.148156 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="75ffc2f37f0663828c033fce2d59c1e7b940cbd240e347042d2341fa7fcc4ac6" exitCode=0 Jan 30 21:35:05 crc kubenswrapper[4834]: I0130 21:35:05.148222 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" 
event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"75ffc2f37f0663828c033fce2d59c1e7b940cbd240e347042d2341fa7fcc4ac6"} Jan 30 21:35:05 crc kubenswrapper[4834]: I0130 21:35:05.148531 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"8384069132f18eea6ac87d501b64935494bfc35764a079472903e1922c44d982"} Jan 30 21:35:05 crc kubenswrapper[4834]: I0130 21:35:05.148553 4834 scope.go:117] "RemoveContainer" containerID="4cb5a4bc85d48be6eae743481c416969d22a0b71074f3b32022fe8457dc9c32b" Jan 30 21:35:06 crc kubenswrapper[4834]: I0130 21:35:06.406153 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-n4kck" Jan 30 21:35:06 crc kubenswrapper[4834]: I0130 21:35:06.561532 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-889p5" Jan 30 21:35:07 crc kubenswrapper[4834]: I0130 21:35:07.170777 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" event={"ID":"1ba55708-11fd-4a17-9a95-88fd28711fb6","Type":"ContainerStarted","Data":"f0b48275669c19e37a0c76b4408d23a3df7de2bdbcdf9a1cb15770e6de0d3ba8"} Jan 30 21:35:07 crc kubenswrapper[4834]: I0130 21:35:07.171270 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:35:07 crc kubenswrapper[4834]: I0130 21:35:07.211015 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" podStartSLOduration=34.231356369 podStartE2EDuration="1m11.210993248s" 
podCreationTimestamp="2026-01-30 21:33:56 +0000 UTC" firstStartedPulling="2026-01-30 21:34:28.99390147 +0000 UTC m=+1120.147047628" lastFinishedPulling="2026-01-30 21:35:05.973538369 +0000 UTC m=+1157.126684507" observedRunningTime="2026-01-30 21:35:07.19969404 +0000 UTC m=+1158.352840198" watchObservedRunningTime="2026-01-30 21:35:07.210993248 +0000 UTC m=+1158.364139396" Jan 30 21:35:18 crc kubenswrapper[4834]: I0130 21:35:18.557341 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.144378 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4j4lj"] Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.146497 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.150107 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.150196 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lbp8f" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.150294 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.150373 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.162294 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4j4lj"] Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.203615 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxcf4"] Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.225678 4834 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxcf4"] Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.225789 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.229121 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.285499 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6c22\" (UniqueName: \"kubernetes.io/projected/00ad01ad-0b62-47f9-becf-b80901d37b2b-kube-api-access-r6c22\") pod \"dnsmasq-dns-675f4bcbfc-4j4lj\" (UID: \"00ad01ad-0b62-47f9-becf-b80901d37b2b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.285576 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ad01ad-0b62-47f9-becf-b80901d37b2b-config\") pod \"dnsmasq-dns-675f4bcbfc-4j4lj\" (UID: \"00ad01ad-0b62-47f9-becf-b80901d37b2b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.387280 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.387335 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6c22\" (UniqueName: \"kubernetes.io/projected/00ad01ad-0b62-47f9-becf-b80901d37b2b-kube-api-access-r6c22\") pod \"dnsmasq-dns-675f4bcbfc-4j4lj\" (UID: \"00ad01ad-0b62-47f9-becf-b80901d37b2b\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.387366 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ad01ad-0b62-47f9-becf-b80901d37b2b-config\") pod \"dnsmasq-dns-675f4bcbfc-4j4lj\" (UID: \"00ad01ad-0b62-47f9-becf-b80901d37b2b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.387386 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-config\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.387439 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxtkl\" (UniqueName: \"kubernetes.io/projected/a1bc6151-ca53-4769-b5e4-751fc6a51203-kube-api-access-qxtkl\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.389039 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ad01ad-0b62-47f9-becf-b80901d37b2b-config\") pod \"dnsmasq-dns-675f4bcbfc-4j4lj\" (UID: \"00ad01ad-0b62-47f9-becf-b80901d37b2b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.414466 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6c22\" (UniqueName: \"kubernetes.io/projected/00ad01ad-0b62-47f9-becf-b80901d37b2b-kube-api-access-r6c22\") pod \"dnsmasq-dns-675f4bcbfc-4j4lj\" (UID: \"00ad01ad-0b62-47f9-becf-b80901d37b2b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 
21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.470020 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.489071 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.489158 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-config\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.489214 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxtkl\" (UniqueName: \"kubernetes.io/projected/a1bc6151-ca53-4769-b5e4-751fc6a51203-kube-api-access-qxtkl\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.490483 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.491098 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-config\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: 
\"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.516760 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxtkl\" (UniqueName: \"kubernetes.io/projected/a1bc6151-ca53-4769-b5e4-751fc6a51203-kube-api-access-qxtkl\") pod \"dnsmasq-dns-78dd6ddcc-bxcf4\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.549976 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:35 crc kubenswrapper[4834]: I0130 21:35:35.996668 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4j4lj"] Jan 30 21:35:36 crc kubenswrapper[4834]: I0130 21:35:36.066069 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxcf4"] Jan 30 21:35:36 crc kubenswrapper[4834]: W0130 21:35:36.071703 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bc6151_ca53_4769_b5e4_751fc6a51203.slice/crio-cbbdfd0e4ea13a3033f1f6624ed2f61f8cc4ae780c70daff008d209a09bff6fe WatchSource:0}: Error finding container cbbdfd0e4ea13a3033f1f6624ed2f61f8cc4ae780c70daff008d209a09bff6fe: Status 404 returned error can't find the container with id cbbdfd0e4ea13a3033f1f6624ed2f61f8cc4ae780c70daff008d209a09bff6fe Jan 30 21:35:36 crc kubenswrapper[4834]: I0130 21:35:36.454079 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" event={"ID":"00ad01ad-0b62-47f9-becf-b80901d37b2b","Type":"ContainerStarted","Data":"fc88c67d37725ed8913b6609b6ce5bf8f2f69014d62822f5b1b1f8b3c97d154c"} Jan 30 21:35:36 crc kubenswrapper[4834]: I0130 21:35:36.456153 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" event={"ID":"a1bc6151-ca53-4769-b5e4-751fc6a51203","Type":"ContainerStarted","Data":"cbbdfd0e4ea13a3033f1f6624ed2f61f8cc4ae780c70daff008d209a09bff6fe"} Jan 30 21:35:37 crc kubenswrapper[4834]: I0130 21:35:37.911646 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4j4lj"] Jan 30 21:35:37 crc kubenswrapper[4834]: I0130 21:35:37.929858 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-q9vn9"] Jan 30 21:35:37 crc kubenswrapper[4834]: I0130 21:35:37.930924 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:37 crc kubenswrapper[4834]: I0130 21:35:37.944737 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-q9vn9"] Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.034015 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5wcr\" (UniqueName: \"kubernetes.io/projected/02bdb3db-66d6-4dfd-b101-24fa78cb6842-kube-api-access-h5wcr\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.034366 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-dns-svc\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.034480 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-config\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: 
\"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.136459 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5wcr\" (UniqueName: \"kubernetes.io/projected/02bdb3db-66d6-4dfd-b101-24fa78cb6842-kube-api-access-h5wcr\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.136532 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-dns-svc\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.136570 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-config\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.137981 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-config\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.138311 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-dns-svc\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc 
kubenswrapper[4834]: I0130 21:35:38.158234 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5wcr\" (UniqueName: \"kubernetes.io/projected/02bdb3db-66d6-4dfd-b101-24fa78cb6842-kube-api-access-h5wcr\") pod \"dnsmasq-dns-666b6646f7-q9vn9\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.235242 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxcf4"] Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.253654 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.266064 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ftq2c"] Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.272870 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.276654 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ftq2c"] Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.340980 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxscg\" (UniqueName: \"kubernetes.io/projected/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-kube-api-access-cxscg\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.341143 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") 
" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.341187 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-config\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.443434 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxscg\" (UniqueName: \"kubernetes.io/projected/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-kube-api-access-cxscg\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.443500 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.443521 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-config\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.444736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-config\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: 
I0130 21:35:38.445005 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.476686 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxscg\" (UniqueName: \"kubernetes.io/projected/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-kube-api-access-cxscg\") pod \"dnsmasq-dns-57d769cc4f-ftq2c\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.593584 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:38 crc kubenswrapper[4834]: I0130 21:35:38.821523 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-q9vn9"] Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.083965 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.085687 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.087953 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.088192 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.088777 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.089224 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.089233 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.090233 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-zw5br" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.091604 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.097830 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.126689 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ftq2c"] Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.157534 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.157582 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.157606 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.157625 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd8c97eb-154c-451c-88ec-025f6148936c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.157844 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.157950 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd8c97eb-154c-451c-88ec-025f6148936c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.158028 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.158118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sb9g\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-kube-api-access-8sb9g\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.158199 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-config-data\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.158245 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.158267 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: W0130 21:35:39.162478 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58dc8ec9_b8e2_4039_b2f3_f30c2706e3ad.slice/crio-ad17df70671c9f68b40fb4820143098ea225552c29d5b923941235eb7aa5d56f WatchSource:0}: Error finding container ad17df70671c9f68b40fb4820143098ea225552c29d5b923941235eb7aa5d56f: Status 404 returned error can't find the container with id ad17df70671c9f68b40fb4820143098ea225552c29d5b923941235eb7aa5d56f Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.261266 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.261321 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sb9g\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-kube-api-access-8sb9g\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.261356 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-config-data\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.261522 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.261544 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.261917 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.262032 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.262388 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-config-data\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.262993 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.262998 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.263119 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.263168 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.263210 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd8c97eb-154c-451c-88ec-025f6148936c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.263244 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.263285 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.263366 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd8c97eb-154c-451c-88ec-025f6148936c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.263581 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.270161 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.272639 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd8c97eb-154c-451c-88ec-025f6148936c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.273789 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.282259 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd8c97eb-154c-451c-88ec-025f6148936c-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.284074 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sb9g\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-kube-api-access-8sb9g\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.291275 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") " pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.413489 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.414884 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.416348 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.419852 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.420196 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.420351 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.420499 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.420646 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.420786 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-58hcq" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.423494 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.456808 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.493104 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" event={"ID":"02bdb3db-66d6-4dfd-b101-24fa78cb6842","Type":"ContainerStarted","Data":"b9b66c5559e1b8c93a54862279ea590441c277e3f8d70db1c2baf69e4f2cd745"} Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.496618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" event={"ID":"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad","Type":"ContainerStarted","Data":"ad17df70671c9f68b40fb4820143098ea225552c29d5b923941235eb7aa5d56f"} Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570085 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570194 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570225 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500f2414-6837-49ac-b834-06b5fd86d2b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570246 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570276 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570302 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570335 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570366 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500f2414-6837-49ac-b834-06b5fd86d2b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570389 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570443 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.570467 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmdxk\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-kube-api-access-gmdxk\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672209 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672293 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672338 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/500f2414-6837-49ac-b834-06b5fd86d2b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672360 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672423 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672445 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmdxk\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-kube-api-access-gmdxk\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672521 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672598 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672626 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500f2414-6837-49ac-b834-06b5fd86d2b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672661 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672693 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.672896 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.673130 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.673506 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.673596 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.674046 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.674434 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.676888 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500f2414-6837-49ac-b834-06b5fd86d2b8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc 
kubenswrapper[4834]: I0130 21:35:39.677135 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.678427 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500f2414-6837-49ac-b834-06b5fd86d2b8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.680595 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.706357 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.714601 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmdxk\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-kube-api-access-gmdxk\") pod \"rabbitmq-cell1-server-0\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.781777 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:35:39 crc kubenswrapper[4834]: I0130 21:35:39.974825 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:35:39 crc kubenswrapper[4834]: W0130 21:35:39.990059 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd8c97eb_154c_451c_88ec_025f6148936c.slice/crio-489f056fc942ecca987917eb48d929baafa4c81ae36d64b4c3837ca2c2417de1 WatchSource:0}: Error finding container 489f056fc942ecca987917eb48d929baafa4c81ae36d64b4c3837ca2c2417de1: Status 404 returned error can't find the container with id 489f056fc942ecca987917eb48d929baafa4c81ae36d64b4c3837ca2c2417de1 Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.341175 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.522729 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"500f2414-6837-49ac-b834-06b5fd86d2b8","Type":"ContainerStarted","Data":"a5cfeed94bca20704ba339b4eb2a60ca5740a01ca6c4d7c5e22223c10f191049"} Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.525740 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bd8c97eb-154c-451c-88ec-025f6148936c","Type":"ContainerStarted","Data":"489f056fc942ecca987917eb48d929baafa4c81ae36d64b4c3837ca2c2417de1"} Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.627072 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.631713 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.640090 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-rd8ck" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.640675 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.643294 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.643728 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.647415 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.676653 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.795523 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.795581 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f917227d-3bb7-4402-9c62-a1ccd41b9782-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.795634 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f917227d-3bb7-4402-9c62-a1ccd41b9782-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.795675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f917227d-3bb7-4402-9c62-a1ccd41b9782-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.795703 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.795729 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-kolla-config\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.795763 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-config-data-default\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.795781 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpz4m\" (UniqueName: 
\"kubernetes.io/projected/f917227d-3bb7-4402-9c62-a1ccd41b9782-kube-api-access-fpz4m\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.897505 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f917227d-3bb7-4402-9c62-a1ccd41b9782-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.897592 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f917227d-3bb7-4402-9c62-a1ccd41b9782-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.897632 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.898764 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-kolla-config\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.899463 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f917227d-3bb7-4402-9c62-a1ccd41b9782-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.899487 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-config-data-default\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.900472 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-kolla-config\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.900552 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-config-data-default\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.901346 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.899550 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpz4m\" (UniqueName: \"kubernetes.io/projected/f917227d-3bb7-4402-9c62-a1ccd41b9782-kube-api-access-fpz4m\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 
21:35:40.905124 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.905210 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f917227d-3bb7-4402-9c62-a1ccd41b9782-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.908552 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f917227d-3bb7-4402-9c62-a1ccd41b9782-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.915305 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f917227d-3bb7-4402-9c62-a1ccd41b9782-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.915409 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f917227d-3bb7-4402-9c62-a1ccd41b9782-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.933832 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.944739 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpz4m\" (UniqueName: \"kubernetes.io/projected/f917227d-3bb7-4402-9c62-a1ccd41b9782-kube-api-access-fpz4m\") pod \"openstack-galera-0\" (UID: \"f917227d-3bb7-4402-9c62-a1ccd41b9782\") " pod="openstack/openstack-galera-0" Jan 30 21:35:40 crc kubenswrapper[4834]: I0130 21:35:40.993710 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.050448 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.052347 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.062740 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.069379 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.069754 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.069983 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4lcgm" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.070988 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.243949 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.244015 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.244045 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.244113 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.244139 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wrzh\" (UniqueName: \"kubernetes.io/projected/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-kube-api-access-6wrzh\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.244195 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.244271 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.244309 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.334961 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.335944 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.342212 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.342585 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pcfw7" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.342753 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.345720 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.345800 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.345845 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.345891 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.345919 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.345940 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.346002 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.346027 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrzh\" (UniqueName: \"kubernetes.io/projected/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-kube-api-access-6wrzh\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.346262 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.346913 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.347532 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.348055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.348077 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.355679 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.362409 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.373028 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrzh\" (UniqueName: \"kubernetes.io/projected/067a1cfb-a1ba-43f7-8669-b233b41cdbd7-kube-api-access-6wrzh\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.404797 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.419584 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"067a1cfb-a1ba-43f7-8669-b233b41cdbd7\") " pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.447920 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af27dc41-8d8b-4471-8481-ca32766a9344-kolla-config\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.448186 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af27dc41-8d8b-4471-8481-ca32766a9344-config-data\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.448361 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af27dc41-8d8b-4471-8481-ca32766a9344-combined-ca-bundle\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.448461 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2fn2\" (UniqueName: \"kubernetes.io/projected/af27dc41-8d8b-4471-8481-ca32766a9344-kube-api-access-p2fn2\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.448538 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/af27dc41-8d8b-4471-8481-ca32766a9344-memcached-tls-certs\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.550805 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/af27dc41-8d8b-4471-8481-ca32766a9344-memcached-tls-certs\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.550875 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af27dc41-8d8b-4471-8481-ca32766a9344-kolla-config\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.550943 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/af27dc41-8d8b-4471-8481-ca32766a9344-config-data\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.550988 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af27dc41-8d8b-4471-8481-ca32766a9344-combined-ca-bundle\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.551012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2fn2\" (UniqueName: \"kubernetes.io/projected/af27dc41-8d8b-4471-8481-ca32766a9344-kube-api-access-p2fn2\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.551799 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/af27dc41-8d8b-4471-8481-ca32766a9344-kolla-config\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.551791 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/af27dc41-8d8b-4471-8481-ca32766a9344-config-data\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.556831 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/af27dc41-8d8b-4471-8481-ca32766a9344-memcached-tls-certs\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 
21:35:42.560679 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af27dc41-8d8b-4471-8481-ca32766a9344-combined-ca-bundle\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.579998 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2fn2\" (UniqueName: \"kubernetes.io/projected/af27dc41-8d8b-4471-8481-ca32766a9344-kube-api-access-p2fn2\") pod \"memcached-0\" (UID: \"af27dc41-8d8b-4471-8481-ca32766a9344\") " pod="openstack/memcached-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.674836 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 30 21:35:42 crc kubenswrapper[4834]: I0130 21:35:42.707692 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 30 21:35:44 crc kubenswrapper[4834]: I0130 21:35:44.194090 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:44 crc kubenswrapper[4834]: I0130 21:35:44.195540 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:35:44 crc kubenswrapper[4834]: I0130 21:35:44.198796 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-r8q77" Jan 30 21:35:44 crc kubenswrapper[4834]: I0130 21:35:44.209317 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:44 crc kubenswrapper[4834]: I0130 21:35:44.380366 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcfqw\" (UniqueName: \"kubernetes.io/projected/8086ba30-2087-423e-835a-78c41737a883-kube-api-access-jcfqw\") pod \"kube-state-metrics-0\" (UID: \"8086ba30-2087-423e-835a-78c41737a883\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:44 crc kubenswrapper[4834]: I0130 21:35:44.481563 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcfqw\" (UniqueName: \"kubernetes.io/projected/8086ba30-2087-423e-835a-78c41737a883-kube-api-access-jcfqw\") pod \"kube-state-metrics-0\" (UID: \"8086ba30-2087-423e-835a-78c41737a883\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:44 crc kubenswrapper[4834]: I0130 21:35:44.502085 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcfqw\" (UniqueName: \"kubernetes.io/projected/8086ba30-2087-423e-835a-78c41737a883-kube-api-access-jcfqw\") pod \"kube-state-metrics-0\" (UID: \"8086ba30-2087-423e-835a-78c41737a883\") " pod="openstack/kube-state-metrics-0" Jan 30 21:35:44 crc kubenswrapper[4834]: I0130 21:35:44.525040 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:35:47 crc kubenswrapper[4834]: I0130 21:35:47.987836 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-t2k4l"] Jan 30 21:35:47 crc kubenswrapper[4834]: I0130 21:35:47.989581 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:47 crc kubenswrapper[4834]: I0130 21:35:47.992823 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t2k4l"] Jan 30 21:35:47 crc kubenswrapper[4834]: I0130 21:35:47.992903 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-whsmn" Jan 30 21:35:47 crc kubenswrapper[4834]: I0130 21:35:47.993079 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 30 21:35:47 crc kubenswrapper[4834]: I0130 21:35:47.993165 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.036872 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qxr8t"] Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.038727 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.047103 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qxr8t"] Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.062474 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-lib\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.062602 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-scripts\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.062690 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-etc-ovs\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.062764 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-log\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.062817 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-run\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.062951 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56znc\" (UniqueName: \"kubernetes.io/projected/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-kube-api-access-56znc\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.163816 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56znc\" (UniqueName: \"kubernetes.io/projected/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-kube-api-access-56znc\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.163859 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493ce910-9c99-49f5-85eb-3917715c87b6-scripts\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.163905 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-run\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.163941 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wxzq\" (UniqueName: 
\"kubernetes.io/projected/493ce910-9c99-49f5-85eb-3917715c87b6-kube-api-access-6wxzq\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.163965 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-lib\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.163997 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-scripts\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164013 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493ce910-9c99-49f5-85eb-3917715c87b6-combined-ca-bundle\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164028 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-etc-ovs\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164053 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-log\") pod \"ovn-controller-ovs-qxr8t\" (UID: 
\"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164072 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-run\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164091 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-run-ovn\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164111 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-log-ovn\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164131 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/493ce910-9c99-49f5-85eb-3917715c87b6-ovn-controller-tls-certs\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164543 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-log\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 
21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164630 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-run\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164677 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-etc-ovs\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.164850 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-var-lib\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.167059 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-scripts\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.184964 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56znc\" (UniqueName: \"kubernetes.io/projected/1ddc19c3-1c4a-43e5-9d87-565575ba3ac1-kube-api-access-56znc\") pod \"ovn-controller-ovs-qxr8t\" (UID: \"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1\") " pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.265153 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/493ce910-9c99-49f5-85eb-3917715c87b6-scripts\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.265213 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-run\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.265330 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wxzq\" (UniqueName: \"kubernetes.io/projected/493ce910-9c99-49f5-85eb-3917715c87b6-kube-api-access-6wxzq\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.265384 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493ce910-9c99-49f5-85eb-3917715c87b6-combined-ca-bundle\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.265726 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-run-ovn\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.265752 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-log-ovn\") pod \"ovn-controller-t2k4l\" (UID: 
\"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.265771 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/493ce910-9c99-49f5-85eb-3917715c87b6-ovn-controller-tls-certs\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.265461 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-run\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.266229 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-run-ovn\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.266268 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/493ce910-9c99-49f5-85eb-3917715c87b6-var-log-ovn\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.267486 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/493ce910-9c99-49f5-85eb-3917715c87b6-scripts\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.269380 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/493ce910-9c99-49f5-85eb-3917715c87b6-ovn-controller-tls-certs\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.280010 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/493ce910-9c99-49f5-85eb-3917715c87b6-combined-ca-bundle\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.282173 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wxzq\" (UniqueName: \"kubernetes.io/projected/493ce910-9c99-49f5-85eb-3917715c87b6-kube-api-access-6wxzq\") pod \"ovn-controller-t2k4l\" (UID: \"493ce910-9c99-49f5-85eb-3917715c87b6\") " pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.316734 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t2k4l" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.358093 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.885547 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.887338 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.891495 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.891579 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.892088 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.892173 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.892503 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ghs4q" Jan 30 21:35:48 crc kubenswrapper[4834]: I0130 21:35:48.895463 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.078740 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63838846-58d0-40d4-a246-7f7cc5012673-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.081300 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.081327 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63838846-58d0-40d4-a246-7f7cc5012673-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.081341 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63838846-58d0-40d4-a246-7f7cc5012673-config\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.081364 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.081429 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.083475 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nth77\" (UniqueName: \"kubernetes.io/projected/63838846-58d0-40d4-a246-7f7cc5012673-kube-api-access-nth77\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.083536 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185185 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63838846-58d0-40d4-a246-7f7cc5012673-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185550 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185581 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63838846-58d0-40d4-a246-7f7cc5012673-config\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185603 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63838846-58d0-40d4-a246-7f7cc5012673-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185630 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " 
pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185703 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185792 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63838846-58d0-40d4-a246-7f7cc5012673-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185826 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nth77\" (UniqueName: \"kubernetes.io/projected/63838846-58d0-40d4-a246-7f7cc5012673-kube-api-access-nth77\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.185860 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.186492 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.188185 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63838846-58d0-40d4-a246-7f7cc5012673-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.189317 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63838846-58d0-40d4-a246-7f7cc5012673-config\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.193797 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.198800 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.205722 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63838846-58d0-40d4-a246-7f7cc5012673-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.206762 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nth77\" (UniqueName: \"kubernetes.io/projected/63838846-58d0-40d4-a246-7f7cc5012673-kube-api-access-nth77\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") 
" pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.221020 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"63838846-58d0-40d4-a246-7f7cc5012673\") " pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:49 crc kubenswrapper[4834]: I0130 21:35:49.509028 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 30 21:35:51 crc kubenswrapper[4834]: I0130 21:35:51.835515 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 21:35:51 crc kubenswrapper[4834]: I0130 21:35:51.837983 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:51 crc kubenswrapper[4834]: I0130 21:35:51.842060 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 30 21:35:51 crc kubenswrapper[4834]: I0130 21:35:51.842784 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7n7zj" Jan 30 21:35:51 crc kubenswrapper[4834]: I0130 21:35:51.844203 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 30 21:35:51 crc kubenswrapper[4834]: I0130 21:35:51.846497 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 30 21:35:51 crc kubenswrapper[4834]: I0130 21:35:51.854971 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.039362 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-config\") pod \"ovsdbserver-sb-0\" 
(UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.039427 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.039482 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.039523 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.039568 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqgqp\" (UniqueName: \"kubernetes.io/projected/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-kube-api-access-qqgqp\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.039589 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 
21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.039637 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.039708 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141121 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-config\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141177 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141232 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141274 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141329 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqgqp\" (UniqueName: \"kubernetes.io/projected/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-kube-api-access-qqgqp\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141350 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141383 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141482 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.141514 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.142000 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-config\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.142643 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.145017 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.148935 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.149741 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.158907 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.161181 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqgqp\" (UniqueName: \"kubernetes.io/projected/0cc2729f-00c7-4137-b6a1-9dfd0e01c60a-kube-api-access-qqgqp\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.171836 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a\") " pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:52 crc kubenswrapper[4834]: I0130 21:35:52.471808 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.498198 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.498776 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qxtkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityCont
ext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-bxcf4_openstack(a1bc6151-ca53-4769-b5e4-751fc6a51203): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.499997 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" podUID="a1bc6151-ca53-4769-b5e4-751fc6a51203" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.520606 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.520782 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5wcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-q9vn9_openstack(02bdb3db-66d6-4dfd-b101-24fa78cb6842): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.521955 4834 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" podUID="02bdb3db-66d6-4dfd-b101-24fa78cb6842" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.538068 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.538267 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r6c22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-4j4lj_openstack(00ad01ad-0b62-47f9-becf-b80901d37b2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.540850 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" podUID="00ad01ad-0b62-47f9-becf-b80901d37b2b" Jan 30 21:35:57 crc kubenswrapper[4834]: I0130 21:35:57.931899 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.988430 4834 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 30 21:35:57 crc kubenswrapper[4834]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/02bdb3db-66d6-4dfd-b101-24fa78cb6842/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 30 21:35:57 crc kubenswrapper[4834]: > podSandboxID="b9b66c5559e1b8c93a54862279ea590441c277e3f8d70db1c2baf69e4f2cd745" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.988524 4834 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 30 21:35:57 crc kubenswrapper[4834]: init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5wcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-q9vn9_openstack(02bdb3db-66d6-4dfd-b101-24fa78cb6842): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/02bdb3db-66d6-4dfd-b101-24fa78cb6842/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 
30 21:35:57 crc kubenswrapper[4834]: > logger="UnhandledError" Jan 30 21:35:57 crc kubenswrapper[4834]: E0130 21:35:57.989717 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/02bdb3db-66d6-4dfd-b101-24fa78cb6842/volume-subpaths/dns-svc/init/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" podUID="02bdb3db-66d6-4dfd-b101-24fa78cb6842" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.321831 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.385059 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.453567 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.457532 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ad01ad-0b62-47f9-becf-b80901d37b2b-config\") pod \"00ad01ad-0b62-47f9-becf-b80901d37b2b\" (UID: \"00ad01ad-0b62-47f9-becf-b80901d37b2b\") " Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.457681 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6c22\" (UniqueName: \"kubernetes.io/projected/00ad01ad-0b62-47f9-becf-b80901d37b2b-kube-api-access-r6c22\") pod \"00ad01ad-0b62-47f9-becf-b80901d37b2b\" (UID: \"00ad01ad-0b62-47f9-becf-b80901d37b2b\") " Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.459337 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ad01ad-0b62-47f9-becf-b80901d37b2b-config" (OuterVolumeSpecName: "config") pod "00ad01ad-0b62-47f9-becf-b80901d37b2b" (UID: "00ad01ad-0b62-47f9-becf-b80901d37b2b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.468973 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ad01ad-0b62-47f9-becf-b80901d37b2b-kube-api-access-r6c22" (OuterVolumeSpecName: "kube-api-access-r6c22") pod "00ad01ad-0b62-47f9-becf-b80901d37b2b" (UID: "00ad01ad-0b62-47f9-becf-b80901d37b2b"). InnerVolumeSpecName "kube-api-access-r6c22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.519737 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.559134 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-config\") pod \"a1bc6151-ca53-4769-b5e4-751fc6a51203\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.559298 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-dns-svc\") pod \"a1bc6151-ca53-4769-b5e4-751fc6a51203\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.559347 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxtkl\" (UniqueName: \"kubernetes.io/projected/a1bc6151-ca53-4769-b5e4-751fc6a51203-kube-api-access-qxtkl\") pod \"a1bc6151-ca53-4769-b5e4-751fc6a51203\" (UID: \"a1bc6151-ca53-4769-b5e4-751fc6a51203\") " Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.559697 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6c22\" (UniqueName: \"kubernetes.io/projected/00ad01ad-0b62-47f9-becf-b80901d37b2b-kube-api-access-r6c22\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.559710 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ad01ad-0b62-47f9-becf-b80901d37b2b-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.561222 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-config" 
(OuterVolumeSpecName: "config") pod "a1bc6151-ca53-4769-b5e4-751fc6a51203" (UID: "a1bc6151-ca53-4769-b5e4-751fc6a51203"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.562703 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1bc6151-ca53-4769-b5e4-751fc6a51203" (UID: "a1bc6151-ca53-4769-b5e4-751fc6a51203"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.572100 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.578199 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.595042 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1bc6151-ca53-4769-b5e4-751fc6a51203-kube-api-access-qxtkl" (OuterVolumeSpecName: "kube-api-access-qxtkl") pod "a1bc6151-ca53-4769-b5e4-751fc6a51203" (UID: "a1bc6151-ca53-4769-b5e4-751fc6a51203"). InnerVolumeSpecName "kube-api-access-qxtkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:35:58 crc kubenswrapper[4834]: W0130 21:35:58.603302 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8086ba30_2087_423e_835a_78c41737a883.slice/crio-88603fbacf25d125b9cde470f1fcb6339439a3cb9c78ad2ad98474ed81764cf4 WatchSource:0}: Error finding container 88603fbacf25d125b9cde470f1fcb6339439a3cb9c78ad2ad98474ed81764cf4: Status 404 returned error can't find the container with id 88603fbacf25d125b9cde470f1fcb6339439a3cb9c78ad2ad98474ed81764cf4 Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.621133 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 30 21:35:58 crc kubenswrapper[4834]: W0130 21:35:58.625310 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63838846_58d0_40d4_a246_7f7cc5012673.slice/crio-6e7e44e3e261aafbd1ad07824e48ecfdab60418edd6699b07e4391f29e76ddcb WatchSource:0}: Error finding container 6e7e44e3e261aafbd1ad07824e48ecfdab60418edd6699b07e4391f29e76ddcb: Status 404 returned error can't find the container with id 6e7e44e3e261aafbd1ad07824e48ecfdab60418edd6699b07e4391f29e76ddcb Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.665232 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.665261 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1bc6151-ca53-4769-b5e4-751fc6a51203-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.665274 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxtkl\" (UniqueName: 
\"kubernetes.io/projected/a1bc6151-ca53-4769-b5e4-751fc6a51203-kube-api-access-qxtkl\") on node \"crc\" DevicePath \"\"" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.675978 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t2k4l"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.680534 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63838846-58d0-40d4-a246-7f7cc5012673","Type":"ContainerStarted","Data":"6e7e44e3e261aafbd1ad07824e48ecfdab60418edd6699b07e4391f29e76ddcb"} Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.682322 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.683296 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4j4lj" event={"ID":"00ad01ad-0b62-47f9-becf-b80901d37b2b","Type":"ContainerDied","Data":"fc88c67d37725ed8913b6609b6ce5bf8f2f69014d62822f5b1b1f8b3c97d154c"} Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.685152 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a","Type":"ContainerStarted","Data":"20ce4c42a7ea611d0ce447e79004af10d375b384f98a6e56801d74a33a3c1587"} Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.694206 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f917227d-3bb7-4402-9c62-a1ccd41b9782","Type":"ContainerStarted","Data":"862972666d773975a4cb24c6b54bacee476c835268ef80a8e1abe080c6f9ad03"} Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.695353 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8086ba30-2087-423e-835a-78c41737a883","Type":"ContainerStarted","Data":"88603fbacf25d125b9cde470f1fcb6339439a3cb9c78ad2ad98474ed81764cf4"} 
Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.696517 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"af27dc41-8d8b-4471-8481-ca32766a9344","Type":"ContainerStarted","Data":"2b54f76d99ae99903e760657dd6740ab4677bcc623fc463e8d0f2022b4aa739c"} Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.697934 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" event={"ID":"a1bc6151-ca53-4769-b5e4-751fc6a51203","Type":"ContainerDied","Data":"cbbdfd0e4ea13a3033f1f6624ed2f61f8cc4ae780c70daff008d209a09bff6fe"} Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.698076 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bxcf4" Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.703804 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"067a1cfb-a1ba-43f7-8669-b233b41cdbd7","Type":"ContainerStarted","Data":"cc70a2a0c4e2c5662acbb2f3c760527f9924fc8a6fab5476c4019ed4a61fe4a6"} Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.706042 4834 generic.go:334] "Generic (PLEG): container finished" podID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" containerID="3c39b55fbf0eea3af69632e44d754a0c579796a31ec2924ec52cf0165ab3875e" exitCode=0 Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.706095 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" event={"ID":"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad","Type":"ContainerDied","Data":"3c39b55fbf0eea3af69632e44d754a0c579796a31ec2924ec52cf0165ab3875e"} Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.741164 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qxr8t"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.790560 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-4j4lj"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.803346 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4j4lj"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.821743 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxcf4"] Jan 30 21:35:58 crc kubenswrapper[4834]: I0130 21:35:58.827492 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bxcf4"] Jan 30 21:35:59 crc kubenswrapper[4834]: I0130 21:35:59.548982 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ad01ad-0b62-47f9-becf-b80901d37b2b" path="/var/lib/kubelet/pods/00ad01ad-0b62-47f9-becf-b80901d37b2b/volumes" Jan 30 21:35:59 crc kubenswrapper[4834]: I0130 21:35:59.549651 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1bc6151-ca53-4769-b5e4-751fc6a51203" path="/var/lib/kubelet/pods/a1bc6151-ca53-4769-b5e4-751fc6a51203/volumes" Jan 30 21:35:59 crc kubenswrapper[4834]: I0130 21:35:59.754364 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bd8c97eb-154c-451c-88ec-025f6148936c","Type":"ContainerStarted","Data":"b5f40a5302db9286d8889928391704f4344288840c59da33e134a6c141d1f7cc"} Jan 30 21:35:59 crc kubenswrapper[4834]: I0130 21:35:59.767716 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t2k4l" event={"ID":"493ce910-9c99-49f5-85eb-3917715c87b6","Type":"ContainerStarted","Data":"83ee3b529773d746d6b889a9325d0cf33c3945436aa9b71615062a93047dfc06"} Jan 30 21:35:59 crc kubenswrapper[4834]: I0130 21:35:59.769876 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxr8t" event={"ID":"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1","Type":"ContainerStarted","Data":"a4d46db1e2c41ae947b2df02f94a30c647c8dc8d1d5ecc9e2179bd34220e29c7"} Jan 30 21:35:59 crc 
kubenswrapper[4834]: I0130 21:35:59.778099 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"500f2414-6837-49ac-b834-06b5fd86d2b8","Type":"ContainerStarted","Data":"298b0d54929de73c138aeb7f470a15fc04c1aa6fa18b6a5212045046eec7737f"} Jan 30 21:35:59 crc kubenswrapper[4834]: I0130 21:35:59.789323 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" event={"ID":"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad","Type":"ContainerStarted","Data":"c2d9592aade1d345238f6fcf85da09172b99d3c4c632ae835ea8a0d0009d742b"} Jan 30 21:35:59 crc kubenswrapper[4834]: I0130 21:35:59.789675 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:35:59 crc kubenswrapper[4834]: I0130 21:35:59.831757 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" podStartSLOduration=3.3749729520000002 podStartE2EDuration="21.831741714s" podCreationTimestamp="2026-01-30 21:35:38 +0000 UTC" firstStartedPulling="2026-01-30 21:35:39.166189765 +0000 UTC m=+1190.319335903" lastFinishedPulling="2026-01-30 21:35:57.622958527 +0000 UTC m=+1208.776104665" observedRunningTime="2026-01-30 21:35:59.826927989 +0000 UTC m=+1210.980074127" watchObservedRunningTime="2026-01-30 21:35:59.831741714 +0000 UTC m=+1210.984887852" Jan 30 21:36:08 crc kubenswrapper[4834]: I0130 21:36:08.594647 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:36:08 crc kubenswrapper[4834]: I0130 21:36:08.658275 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-q9vn9"] Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.345882 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.494630 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5wcr\" (UniqueName: \"kubernetes.io/projected/02bdb3db-66d6-4dfd-b101-24fa78cb6842-kube-api-access-h5wcr\") pod \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.494813 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-dns-svc\") pod \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.494867 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-config\") pod \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\" (UID: \"02bdb3db-66d6-4dfd-b101-24fa78cb6842\") " Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.501966 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bdb3db-66d6-4dfd-b101-24fa78cb6842-kube-api-access-h5wcr" (OuterVolumeSpecName: "kube-api-access-h5wcr") pod "02bdb3db-66d6-4dfd-b101-24fa78cb6842" (UID: "02bdb3db-66d6-4dfd-b101-24fa78cb6842"). InnerVolumeSpecName "kube-api-access-h5wcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.517683 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-config" (OuterVolumeSpecName: "config") pod "02bdb3db-66d6-4dfd-b101-24fa78cb6842" (UID: "02bdb3db-66d6-4dfd-b101-24fa78cb6842"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.525700 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02bdb3db-66d6-4dfd-b101-24fa78cb6842" (UID: "02bdb3db-66d6-4dfd-b101-24fa78cb6842"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.597508 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.597551 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bdb3db-66d6-4dfd-b101-24fa78cb6842-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.597561 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5wcr\" (UniqueName: \"kubernetes.io/projected/02bdb3db-66d6-4dfd-b101-24fa78cb6842-kube-api-access-h5wcr\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.870035 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" event={"ID":"02bdb3db-66d6-4dfd-b101-24fa78cb6842","Type":"ContainerDied","Data":"b9b66c5559e1b8c93a54862279ea590441c277e3f8d70db1c2baf69e4f2cd745"} Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.870127 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-q9vn9" Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.919968 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-q9vn9"] Jan 30 21:36:09 crc kubenswrapper[4834]: I0130 21:36:09.928033 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-q9vn9"] Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.543983 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bdb3db-66d6-4dfd-b101-24fa78cb6842" path="/var/lib/kubelet/pods/02bdb3db-66d6-4dfd-b101-24fa78cb6842/volumes" Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.896695 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a","Type":"ContainerStarted","Data":"cdfbb1418d984116369eba87994507b6b9cf678ad5cb0004c14b625d55d22b0a"} Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.899148 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f917227d-3bb7-4402-9c62-a1ccd41b9782","Type":"ContainerStarted","Data":"b3bdccec66b3d60a0f5f17909f819be0c7392c26d74268e6c601e2f655063828"} Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.902478 4834 generic.go:334] "Generic (PLEG): container finished" podID="1ddc19c3-1c4a-43e5-9d87-565575ba3ac1" containerID="cb3fd028fae5d82e237593a8dc0808e500eb0424df2e2569407729963c5cbd0a" exitCode=0 Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.903434 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxr8t" event={"ID":"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1","Type":"ContainerDied","Data":"cb3fd028fae5d82e237593a8dc0808e500eb0424df2e2569407729963c5cbd0a"} Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.905520 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"067a1cfb-a1ba-43f7-8669-b233b41cdbd7","Type":"ContainerStarted","Data":"438b070b7c2c6ce8670b52b55361ed1eac9eaf9d6ef78cc90b7e90ff355bcbb7"} Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.907282 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t2k4l" event={"ID":"493ce910-9c99-49f5-85eb-3917715c87b6","Type":"ContainerStarted","Data":"03bfc9a246d85a43bfb6a26d6b3c4959feaf546180aa7e195a2125ecfd7c7550"} Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.907846 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-t2k4l" Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.909838 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8086ba30-2087-423e-835a-78c41737a883","Type":"ContainerStarted","Data":"416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d"} Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.910381 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.912084 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"af27dc41-8d8b-4471-8481-ca32766a9344","Type":"ContainerStarted","Data":"57de72af96e4c6bb5f5f1f0826e63846247febe2882f0d032e1d3164a15dca6d"} Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.922231 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.943316 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63838846-58d0-40d4-a246-7f7cc5012673","Type":"ContainerStarted","Data":"8b3747b4b061a36a84728b2be422ecb131ea1de7a55cf32d392b93a6da2f5498"} Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.975458 4834 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovn-controller-t2k4l" podStartSLOduration=14.857305612 podStartE2EDuration="24.975435539s" podCreationTimestamp="2026-01-30 21:35:47 +0000 UTC" firstStartedPulling="2026-01-30 21:35:58.679087259 +0000 UTC m=+1209.832233397" lastFinishedPulling="2026-01-30 21:36:08.797217176 +0000 UTC m=+1219.950363324" observedRunningTime="2026-01-30 21:36:11.955510119 +0000 UTC m=+1223.108656257" watchObservedRunningTime="2026-01-30 21:36:11.975435539 +0000 UTC m=+1223.128581677" Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.982689 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.839877944 podStartE2EDuration="27.982671713s" podCreationTimestamp="2026-01-30 21:35:44 +0000 UTC" firstStartedPulling="2026-01-30 21:35:58.607260481 +0000 UTC m=+1209.760406619" lastFinishedPulling="2026-01-30 21:36:10.75005425 +0000 UTC m=+1221.903200388" observedRunningTime="2026-01-30 21:36:11.96907093 +0000 UTC m=+1223.122217068" watchObservedRunningTime="2026-01-30 21:36:11.982671713 +0000 UTC m=+1223.135817851" Jan 30 21:36:11 crc kubenswrapper[4834]: I0130 21:36:11.999884 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.392019356 podStartE2EDuration="29.999855796s" podCreationTimestamp="2026-01-30 21:35:42 +0000 UTC" firstStartedPulling="2026-01-30 21:35:57.935255224 +0000 UTC m=+1209.088401352" lastFinishedPulling="2026-01-30 21:36:08.543091664 +0000 UTC m=+1219.696237792" observedRunningTime="2026-01-30 21:36:11.99291349 +0000 UTC m=+1223.146059628" watchObservedRunningTime="2026-01-30 21:36:11.999855796 +0000 UTC m=+1223.153001934" Jan 30 21:36:12 crc kubenswrapper[4834]: I0130 21:36:12.960029 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxr8t" 
event={"ID":"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1","Type":"ContainerStarted","Data":"1a014f74aa1743c06deb99a5ccf96a8f4068f3e01be1813a85bbc5a7ffadb6e0"} Jan 30 21:36:15 crc kubenswrapper[4834]: I0130 21:36:15.007292 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxr8t" event={"ID":"1ddc19c3-1c4a-43e5-9d87-565575ba3ac1","Type":"ContainerStarted","Data":"b07fc07b6f20343a3123efd7ed5f2ab75ee1f29fe38d0c32a9e275d48fc1a8ed"} Jan 30 21:36:15 crc kubenswrapper[4834]: I0130 21:36:15.008055 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:36:15 crc kubenswrapper[4834]: I0130 21:36:15.008091 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:36:15 crc kubenswrapper[4834]: I0130 21:36:15.032194 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qxr8t" podStartSLOduration=17.038818007 podStartE2EDuration="27.032165178s" podCreationTimestamp="2026-01-30 21:35:48 +0000 UTC" firstStartedPulling="2026-01-30 21:35:58.805133981 +0000 UTC m=+1209.958280119" lastFinishedPulling="2026-01-30 21:36:08.798481152 +0000 UTC m=+1219.951627290" observedRunningTime="2026-01-30 21:36:15.024489002 +0000 UTC m=+1226.177635140" watchObservedRunningTime="2026-01-30 21:36:15.032165178 +0000 UTC m=+1226.185311316" Jan 30 21:36:17 crc kubenswrapper[4834]: I0130 21:36:17.708970 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 30 21:36:18 crc kubenswrapper[4834]: I0130 21:36:18.031704 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"63838846-58d0-40d4-a246-7f7cc5012673","Type":"ContainerStarted","Data":"da2c20ceefcaea1faef686a3a3da10c3e39383f5212bcf6fb8ab3b70256d63a4"} Jan 30 21:36:18 crc kubenswrapper[4834]: I0130 21:36:18.033552 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0cc2729f-00c7-4137-b6a1-9dfd0e01c60a","Type":"ContainerStarted","Data":"e514137b4e4e257b5a96a36b8ad2a10ffd6d1db09a33501e806035fae095f950"} Jan 30 21:36:18 crc kubenswrapper[4834]: I0130 21:36:18.051609 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.920183746 podStartE2EDuration="31.051595418s" podCreationTimestamp="2026-01-30 21:35:47 +0000 UTC" firstStartedPulling="2026-01-30 21:35:58.62785479 +0000 UTC m=+1209.781000928" lastFinishedPulling="2026-01-30 21:36:17.759266432 +0000 UTC m=+1228.912412600" observedRunningTime="2026-01-30 21:36:18.049474338 +0000 UTC m=+1229.202620476" watchObservedRunningTime="2026-01-30 21:36:18.051595418 +0000 UTC m=+1229.204741556" Jan 30 21:36:18 crc kubenswrapper[4834]: I0130 21:36:18.074363 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.854839909 podStartE2EDuration="28.074349327s" podCreationTimestamp="2026-01-30 21:35:50 +0000 UTC" firstStartedPulling="2026-01-30 21:35:58.529816414 +0000 UTC m=+1209.682962542" lastFinishedPulling="2026-01-30 21:36:17.749325822 +0000 UTC m=+1228.902471960" observedRunningTime="2026-01-30 21:36:18.068958636 +0000 UTC m=+1229.222104774" watchObservedRunningTime="2026-01-30 21:36:18.074349327 +0000 UTC m=+1229.227495465" Jan 30 21:36:19 crc kubenswrapper[4834]: I0130 21:36:19.473934 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 30 21:36:19 crc kubenswrapper[4834]: I0130 21:36:19.510453 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 30 21:36:19 crc kubenswrapper[4834]: I0130 21:36:19.510509 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 30 21:36:19 crc kubenswrapper[4834]: I0130 
21:36:19.516054 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 30 21:36:19 crc kubenswrapper[4834]: I0130 21:36:19.556447 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.053837 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.128859 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.130995 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.425319 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6xvc2"] Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.429526 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.442917 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6xvc2"] Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.481615 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.546185 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-85gcz"] Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.547943 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.548518 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-config\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.548563 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.548606 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59w8z\" (UniqueName: \"kubernetes.io/projected/cabef511-5eec-4740-9990-5c86aa75b75e-kube-api-access-59w8z\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.548649 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.549855 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.564450 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-metrics-85gcz"] Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.622820 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.630840 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.631263 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6xvc2"] Jan 30 21:36:20 crc kubenswrapper[4834]: E0130 21:36:20.632488 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-59w8z ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" podUID="cabef511-5eec-4740-9990-5c86aa75b75e" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.639095 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.639630 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.639936 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wfmzj" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.640812 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.646915 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.650225 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73b7f87d-7f27-4150-9542-ccf5985fd8c6-metrics-certs-tls-certs\") 
pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.650402 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.650550 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-config\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.650668 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gg9f\" (UniqueName: \"kubernetes.io/projected/795d7189-87c6-410a-bd1e-aecf055a1719-kube-api-access-2gg9f\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.650776 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/73b7f87d-7f27-4150-9542-ccf5985fd8c6-ovn-rundir\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.650993 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: 
\"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651112 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651235 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59w8z\" (UniqueName: \"kubernetes.io/projected/cabef511-5eec-4740-9990-5c86aa75b75e-kube-api-access-59w8z\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651366 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795d7189-87c6-410a-bd1e-aecf055a1719-config\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651535 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/795d7189-87c6-410a-bd1e-aecf055a1719-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651653 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 
21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651761 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795d7189-87c6-410a-bd1e-aecf055a1719-scripts\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651858 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/73b7f87d-7f27-4150-9542-ccf5985fd8c6-ovs-rundir\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651951 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73b7f87d-7f27-4150-9542-ccf5985fd8c6-config\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.652073 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4gf\" (UniqueName: \"kubernetes.io/projected/73b7f87d-7f27-4150-9542-ccf5985fd8c6-kube-api-access-vg4gf\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.652150 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.651536 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-config\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.652294 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.652398 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73b7f87d-7f27-4150-9542-ccf5985fd8c6-combined-ca-bundle\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.653321 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.688054 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-gbj7z"] Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.689746 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.694286 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gbj7z"] Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.695662 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.701397 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59w8z\" (UniqueName: \"kubernetes.io/projected/cabef511-5eec-4740-9990-5c86aa75b75e-kube-api-access-59w8z\") pod \"dnsmasq-dns-6bc7876d45-6xvc2\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753466 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753509 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753530 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-dns-svc\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: 
I0130 21:36:20.753559 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753579 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vnkg\" (UniqueName: \"kubernetes.io/projected/934f7f22-200f-4eca-8a12-dfca8511eb5c-kube-api-access-2vnkg\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753602 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795d7189-87c6-410a-bd1e-aecf055a1719-config\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753751 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/795d7189-87c6-410a-bd1e-aecf055a1719-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753815 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795d7189-87c6-410a-bd1e-aecf055a1719-scripts\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753847 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/73b7f87d-7f27-4150-9542-ccf5985fd8c6-ovs-rundir\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753868 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73b7f87d-7f27-4150-9542-ccf5985fd8c6-config\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753942 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4gf\" (UniqueName: \"kubernetes.io/projected/73b7f87d-7f27-4150-9542-ccf5985fd8c6-kube-api-access-vg4gf\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753971 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.753991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73b7f87d-7f27-4150-9542-ccf5985fd8c6-combined-ca-bundle\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754017 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/73b7f87d-7f27-4150-9542-ccf5985fd8c6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754062 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754125 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/73b7f87d-7f27-4150-9542-ccf5985fd8c6-ovs-rundir\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754133 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-config\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754199 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gg9f\" (UniqueName: \"kubernetes.io/projected/795d7189-87c6-410a-bd1e-aecf055a1719-kube-api-access-2gg9f\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754217 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/795d7189-87c6-410a-bd1e-aecf055a1719-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754233 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/73b7f87d-7f27-4150-9542-ccf5985fd8c6-ovn-rundir\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754430 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/73b7f87d-7f27-4150-9542-ccf5985fd8c6-ovn-rundir\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754569 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795d7189-87c6-410a-bd1e-aecf055a1719-config\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.754600 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73b7f87d-7f27-4150-9542-ccf5985fd8c6-config\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.755009 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795d7189-87c6-410a-bd1e-aecf055a1719-scripts\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.757322 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.757813 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/73b7f87d-7f27-4150-9542-ccf5985fd8c6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.757856 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.758297 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795d7189-87c6-410a-bd1e-aecf055a1719-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.761824 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73b7f87d-7f27-4150-9542-ccf5985fd8c6-combined-ca-bundle\") pod \"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.770641 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4gf\" (UniqueName: \"kubernetes.io/projected/73b7f87d-7f27-4150-9542-ccf5985fd8c6-kube-api-access-vg4gf\") pod 
\"ovn-controller-metrics-85gcz\" (UID: \"73b7f87d-7f27-4150-9542-ccf5985fd8c6\") " pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.772007 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gg9f\" (UniqueName: \"kubernetes.io/projected/795d7189-87c6-410a-bd1e-aecf055a1719-kube-api-access-2gg9f\") pod \"ovn-northd-0\" (UID: \"795d7189-87c6-410a-bd1e-aecf055a1719\") " pod="openstack/ovn-northd-0" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.855630 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-config\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.855736 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.855760 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-dns-svc\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.855790 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " 
pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.855812 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vnkg\" (UniqueName: \"kubernetes.io/projected/934f7f22-200f-4eca-8a12-dfca8511eb5c-kube-api-access-2vnkg\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.856766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.856773 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-config\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.857259 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-dns-svc\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.857295 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 
21:36:20.868502 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-85gcz" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.872757 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vnkg\" (UniqueName: \"kubernetes.io/projected/934f7f22-200f-4eca-8a12-dfca8511eb5c-kube-api-access-2vnkg\") pod \"dnsmasq-dns-8554648995-gbj7z\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:20 crc kubenswrapper[4834]: I0130 21:36:20.956594 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.043942 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.080339 4834 generic.go:334] "Generic (PLEG): container finished" podID="f917227d-3bb7-4402-9c62-a1ccd41b9782" containerID="b3bdccec66b3d60a0f5f17909f819be0c7392c26d74268e6c601e2f655063828" exitCode=0 Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.080456 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f917227d-3bb7-4402-9c62-a1ccd41b9782","Type":"ContainerDied","Data":"b3bdccec66b3d60a0f5f17909f819be0c7392c26d74268e6c601e2f655063828"} Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.086570 4834 generic.go:334] "Generic (PLEG): container finished" podID="067a1cfb-a1ba-43f7-8669-b233b41cdbd7" containerID="438b070b7c2c6ce8670b52b55361ed1eac9eaf9d6ef78cc90b7e90ff355bcbb7" exitCode=0 Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.086816 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"067a1cfb-a1ba-43f7-8669-b233b41cdbd7","Type":"ContainerDied","Data":"438b070b7c2c6ce8670b52b55361ed1eac9eaf9d6ef78cc90b7e90ff355bcbb7"} Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.087288 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.102521 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.267004 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-config\") pod \"cabef511-5eec-4740-9990-5c86aa75b75e\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.267441 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-ovsdbserver-sb\") pod \"cabef511-5eec-4740-9990-5c86aa75b75e\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.267555 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-dns-svc\") pod \"cabef511-5eec-4740-9990-5c86aa75b75e\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.267685 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59w8z\" (UniqueName: \"kubernetes.io/projected/cabef511-5eec-4740-9990-5c86aa75b75e-kube-api-access-59w8z\") pod \"cabef511-5eec-4740-9990-5c86aa75b75e\" (UID: \"cabef511-5eec-4740-9990-5c86aa75b75e\") " Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.279131 4834 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cabef511-5eec-4740-9990-5c86aa75b75e" (UID: "cabef511-5eec-4740-9990-5c86aa75b75e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.279496 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cabef511-5eec-4740-9990-5c86aa75b75e" (UID: "cabef511-5eec-4740-9990-5c86aa75b75e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.282915 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-config" (OuterVolumeSpecName: "config") pod "cabef511-5eec-4740-9990-5c86aa75b75e" (UID: "cabef511-5eec-4740-9990-5c86aa75b75e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.290610 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cabef511-5eec-4740-9990-5c86aa75b75e-kube-api-access-59w8z" (OuterVolumeSpecName: "kube-api-access-59w8z") pod "cabef511-5eec-4740-9990-5c86aa75b75e" (UID: "cabef511-5eec-4740-9990-5c86aa75b75e"). InnerVolumeSpecName "kube-api-access-59w8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.369826 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.369863 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59w8z\" (UniqueName: \"kubernetes.io/projected/cabef511-5eec-4740-9990-5c86aa75b75e-kube-api-access-59w8z\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.369877 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.369888 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cabef511-5eec-4740-9990-5c86aa75b75e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.383766 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-85gcz"] Jan 30 21:36:21 crc kubenswrapper[4834]: I0130 21:36:21.491016 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 30 21:36:21 crc kubenswrapper[4834]: W0130 21:36:21.500525 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod795d7189_87c6_410a_bd1e_aecf055a1719.slice/crio-a39de37e1748e3a94ac405109af617c990aa380ef5528b8c65e79f5545a60d39 WatchSource:0}: Error finding container a39de37e1748e3a94ac405109af617c990aa380ef5528b8c65e79f5545a60d39: Status 404 returned error can't find the container with id a39de37e1748e3a94ac405109af617c990aa380ef5528b8c65e79f5545a60d39 Jan 30 21:36:21 crc 
kubenswrapper[4834]: I0130 21:36:21.685894 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gbj7z"] Jan 30 21:36:21 crc kubenswrapper[4834]: W0130 21:36:21.691825 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod934f7f22_200f_4eca_8a12_dfca8511eb5c.slice/crio-65b2c7ff7ae725ddb6096af6c97cc36219e7d690482d0de906242640169c9bf6 WatchSource:0}: Error finding container 65b2c7ff7ae725ddb6096af6c97cc36219e7d690482d0de906242640169c9bf6: Status 404 returned error can't find the container with id 65b2c7ff7ae725ddb6096af6c97cc36219e7d690482d0de906242640169c9bf6 Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.097468 4834 generic.go:334] "Generic (PLEG): container finished" podID="934f7f22-200f-4eca-8a12-dfca8511eb5c" containerID="9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0" exitCode=0 Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.097518 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gbj7z" event={"ID":"934f7f22-200f-4eca-8a12-dfca8511eb5c","Type":"ContainerDied","Data":"9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0"} Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.097949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gbj7z" event={"ID":"934f7f22-200f-4eca-8a12-dfca8511eb5c","Type":"ContainerStarted","Data":"65b2c7ff7ae725ddb6096af6c97cc36219e7d690482d0de906242640169c9bf6"} Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.100261 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"067a1cfb-a1ba-43f7-8669-b233b41cdbd7","Type":"ContainerStarted","Data":"af3e361466f360098219ca30f0d8a31911c2ec54eba00f753c678266c6b22f6f"} Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.103091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-85gcz" event={"ID":"73b7f87d-7f27-4150-9542-ccf5985fd8c6","Type":"ContainerStarted","Data":"2132cd698463901fc0eeecbb68cf5ff1ce2d6d8763bcd24ff4e0c54ada96579d"} Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.103123 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-85gcz" event={"ID":"73b7f87d-7f27-4150-9542-ccf5985fd8c6","Type":"ContainerStarted","Data":"7c5792016233f9f39694a69c2a3382f459037462d8b697b088b303c7c140b1e7"} Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.104164 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"795d7189-87c6-410a-bd1e-aecf055a1719","Type":"ContainerStarted","Data":"a39de37e1748e3a94ac405109af617c990aa380ef5528b8c65e79f5545a60d39"} Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.106748 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f917227d-3bb7-4402-9c62-a1ccd41b9782","Type":"ContainerStarted","Data":"5a297e7581f93467b8a93afcf28ab24067e26d4d98c99e50c676a208084b44b0"} Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.106794 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-6xvc2" Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.155358 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=32.74630684 podStartE2EDuration="43.155338162s" podCreationTimestamp="2026-01-30 21:35:39 +0000 UTC" firstStartedPulling="2026-01-30 21:35:58.388183954 +0000 UTC m=+1209.541330082" lastFinishedPulling="2026-01-30 21:36:08.797215266 +0000 UTC m=+1219.950361404" observedRunningTime="2026-01-30 21:36:22.141561775 +0000 UTC m=+1233.294707913" watchObservedRunningTime="2026-01-30 21:36:22.155338162 +0000 UTC m=+1233.308484300" Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.169508 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.982216811 podStartE2EDuration="41.16946678s" podCreationTimestamp="2026-01-30 21:35:41 +0000 UTC" firstStartedPulling="2026-01-30 21:35:58.607011634 +0000 UTC m=+1209.760157772" lastFinishedPulling="2026-01-30 21:36:08.794261603 +0000 UTC m=+1219.947407741" observedRunningTime="2026-01-30 21:36:22.163741259 +0000 UTC m=+1233.316887427" watchObservedRunningTime="2026-01-30 21:36:22.16946678 +0000 UTC m=+1233.322612918" Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.185688 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-85gcz" podStartSLOduration=2.185668565 podStartE2EDuration="2.185668565s" podCreationTimestamp="2026-01-30 21:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:22.184994926 +0000 UTC m=+1233.338141064" watchObservedRunningTime="2026-01-30 21:36:22.185668565 +0000 UTC m=+1233.338814703" Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.282461 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6bc7876d45-6xvc2"] Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.298590 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-6xvc2"] Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.676024 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 30 21:36:22 crc kubenswrapper[4834]: I0130 21:36:22.676337 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 30 21:36:23 crc kubenswrapper[4834]: I0130 21:36:23.120008 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gbj7z" event={"ID":"934f7f22-200f-4eca-8a12-dfca8511eb5c","Type":"ContainerStarted","Data":"476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4"} Jan 30 21:36:23 crc kubenswrapper[4834]: I0130 21:36:23.150589 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-gbj7z" podStartSLOduration=3.150563973 podStartE2EDuration="3.150563973s" podCreationTimestamp="2026-01-30 21:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:23.150515422 +0000 UTC m=+1234.303661570" watchObservedRunningTime="2026-01-30 21:36:23.150563973 +0000 UTC m=+1234.303710121" Jan 30 21:36:23 crc kubenswrapper[4834]: I0130 21:36:23.541793 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cabef511-5eec-4740-9990-5c86aa75b75e" path="/var/lib/kubelet/pods/cabef511-5eec-4740-9990-5c86aa75b75e/volumes" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.130508 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"795d7189-87c6-410a-bd1e-aecf055a1719","Type":"ContainerStarted","Data":"072ccf2e73e8f8838218671a917db4db8d57ecc7c0b3ed001b36af83cfd33895"} 
Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.130840 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.130855 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"795d7189-87c6-410a-bd1e-aecf055a1719","Type":"ContainerStarted","Data":"f2b83cd72de0bf91be31d55767a588fc6c03e2393a5ee74a0901d1e8abda4ed0"} Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.130876 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.153053 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.578893125 podStartE2EDuration="4.153028506s" podCreationTimestamp="2026-01-30 21:36:20 +0000 UTC" firstStartedPulling="2026-01-30 21:36:21.503139492 +0000 UTC m=+1232.656285630" lastFinishedPulling="2026-01-30 21:36:23.077274863 +0000 UTC m=+1234.230421011" observedRunningTime="2026-01-30 21:36:24.145900576 +0000 UTC m=+1235.299046724" watchObservedRunningTime="2026-01-30 21:36:24.153028506 +0000 UTC m=+1235.306174684" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.510870 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gbj7z"] Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.572855 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.576872 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g4wpn"] Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.592519 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.605968 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g4wpn"] Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.733144 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.733297 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.733324 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhnm\" (UniqueName: \"kubernetes.io/projected/3321163d-92e3-4a02-8700-5d403f01f247-kube-api-access-rdhnm\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.733353 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.733374 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-config\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.834759 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.834805 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhnm\" (UniqueName: \"kubernetes.io/projected/3321163d-92e3-4a02-8700-5d403f01f247-kube-api-access-rdhnm\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.834829 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.834846 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-config\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.834903 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.835597 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.835734 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-config\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.836064 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.836660 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.860044 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhnm\" (UniqueName: \"kubernetes.io/projected/3321163d-92e3-4a02-8700-5d403f01f247-kube-api-access-rdhnm\") pod \"dnsmasq-dns-b8fbc5445-g4wpn\" 
(UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:24 crc kubenswrapper[4834]: I0130 21:36:24.926560 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.398319 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g4wpn"] Jan 30 21:36:25 crc kubenswrapper[4834]: W0130 21:36:25.406637 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3321163d_92e3_4a02_8700_5d403f01f247.slice/crio-754287b86e5c3b741e1b5e35b824c2efa2fe1138cbcb896e9eb0cdc3f52d218a WatchSource:0}: Error finding container 754287b86e5c3b741e1b5e35b824c2efa2fe1138cbcb896e9eb0cdc3f52d218a: Status 404 returned error can't find the container with id 754287b86e5c3b741e1b5e35b824c2efa2fe1138cbcb896e9eb0cdc3f52d218a Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.772048 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.777923 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.780151 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-glv6k" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.780154 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.788532 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.789013 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.796389 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.852620 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/070baa9f-0897-4fe2-bc14-68a831d81dce-lock\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.852686 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.852778 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc 
kubenswrapper[4834]: I0130 21:36:25.852868 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070baa9f-0897-4fe2-bc14-68a831d81dce-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.852915 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/070baa9f-0897-4fe2-bc14-68a831d81dce-cache\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.852945 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc7cd\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-kube-api-access-zc7cd\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.953736 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/070baa9f-0897-4fe2-bc14-68a831d81dce-cache\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.953803 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc7cd\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-kube-api-access-zc7cd\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.953843 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"lock\" (UniqueName: \"kubernetes.io/empty-dir/070baa9f-0897-4fe2-bc14-68a831d81dce-lock\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.953871 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.953913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.953970 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070baa9f-0897-4fe2-bc14-68a831d81dce-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: E0130 21:36:25.954095 4834 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:36:25 crc kubenswrapper[4834]: E0130 21:36:25.954145 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:36:25 crc kubenswrapper[4834]: E0130 21:36:25.954212 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift podName:070baa9f-0897-4fe2-bc14-68a831d81dce nodeName:}" failed. 
No retries permitted until 2026-01-30 21:36:26.454193858 +0000 UTC m=+1237.607339996 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift") pod "swift-storage-0" (UID: "070baa9f-0897-4fe2-bc14-68a831d81dce") : configmap "swift-ring-files" not found Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.954364 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/070baa9f-0897-4fe2-bc14-68a831d81dce-cache\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.954439 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.954597 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/070baa9f-0897-4fe2-bc14-68a831d81dce-lock\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.960286 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/070baa9f-0897-4fe2-bc14-68a831d81dce-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.973188 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc7cd\" (UniqueName: 
\"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-kube-api-access-zc7cd\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:25 crc kubenswrapper[4834]: I0130 21:36:25.994468 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.145954 4834 generic.go:334] "Generic (PLEG): container finished" podID="3321163d-92e3-4a02-8700-5d403f01f247" containerID="b0fbc3a50253a07c35c85086ae5abdb6aa02109b5d971a11d59f3916069cc340" exitCode=0 Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.146156 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-gbj7z" podUID="934f7f22-200f-4eca-8a12-dfca8511eb5c" containerName="dnsmasq-dns" containerID="cri-o://476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4" gracePeriod=10 Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.146881 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" event={"ID":"3321163d-92e3-4a02-8700-5d403f01f247","Type":"ContainerDied","Data":"b0fbc3a50253a07c35c85086ae5abdb6aa02109b5d971a11d59f3916069cc340"} Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.146930 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" event={"ID":"3321163d-92e3-4a02-8700-5d403f01f247","Type":"ContainerStarted","Data":"754287b86e5c3b741e1b5e35b824c2efa2fe1138cbcb896e9eb0cdc3f52d218a"} Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.291499 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-zxsg7"] Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.292615 4834 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.299911 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.300128 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.300142 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.321245 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bv5d8"] Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.322742 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.328472 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-zxsg7"] Jan 30 21:36:26 crc kubenswrapper[4834]: E0130 21:36:26.331855 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-kl9zn ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-kl9zn ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-zxsg7" podUID="a8e5580f-6518-4429-baee-cd9fd1597930" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.337626 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bv5d8"] Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.385910 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-zxsg7"] Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 
21:36:26.462371 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-swiftconf\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.462436 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-combined-ca-bundle\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.462564 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-scripts\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.462625 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-etc-swift\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.462648 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-ring-data-devices\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 
21:36:26.462675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-dispersionconf\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.462701 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-ring-data-devices\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.462875 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-swiftconf\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.462942 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-scripts\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.462978 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:26 crc kubenswrapper[4834]: E0130 21:36:26.463095 4834 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:36:26 crc kubenswrapper[4834]: E0130 21:36:26.463109 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.463104 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz6vq\" (UniqueName: \"kubernetes.io/projected/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-kube-api-access-hz6vq\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: E0130 21:36:26.463146 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift podName:070baa9f-0897-4fe2-bc14-68a831d81dce nodeName:}" failed. No retries permitted until 2026-01-30 21:36:27.463133062 +0000 UTC m=+1238.616279200 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift") pod "swift-storage-0" (UID: "070baa9f-0897-4fe2-bc14-68a831d81dce") : configmap "swift-ring-files" not found Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.463160 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-dispersionconf\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.463243 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a8e5580f-6518-4429-baee-cd9fd1597930-etc-swift\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.463298 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-combined-ca-bundle\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.463377 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl9zn\" (UniqueName: \"kubernetes.io/projected/a8e5580f-6518-4429-baee-cd9fd1597930-kube-api-access-kl9zn\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.567658 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-swiftconf\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568011 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-combined-ca-bundle\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568056 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-scripts\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568084 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-etc-swift\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568108 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-ring-data-devices\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568133 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-dispersionconf\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568162 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-ring-data-devices\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568216 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-swiftconf\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568255 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-scripts\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568304 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz6vq\" (UniqueName: \"kubernetes.io/projected/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-kube-api-access-hz6vq\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568320 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-dispersionconf\") pod 
\"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568353 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a8e5580f-6518-4429-baee-cd9fd1597930-etc-swift\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568379 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-combined-ca-bundle\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.568430 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl9zn\" (UniqueName: \"kubernetes.io/projected/a8e5580f-6518-4429-baee-cd9fd1597930-kube-api-access-kl9zn\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.570049 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-ring-data-devices\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.571587 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a8e5580f-6518-4429-baee-cd9fd1597930-etc-swift\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " 
pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.573052 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-swiftconf\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.573675 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-scripts\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.574413 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-etc-swift\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.574983 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-scripts\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.575255 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-ring-data-devices\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.578439 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-dispersionconf\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.578750 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-dispersionconf\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.579446 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-combined-ca-bundle\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.580308 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-combined-ca-bundle\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.580785 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-swiftconf\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.585899 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz6vq\" (UniqueName: 
\"kubernetes.io/projected/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-kube-api-access-hz6vq\") pod \"swift-ring-rebalance-bv5d8\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.592711 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl9zn\" (UniqueName: \"kubernetes.io/projected/a8e5580f-6518-4429-baee-cd9fd1597930-kube-api-access-kl9zn\") pod \"swift-ring-rebalance-zxsg7\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.655053 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.665426 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.771641 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-config\") pod \"934f7f22-200f-4eca-8a12-dfca8511eb5c\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.771735 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-nb\") pod \"934f7f22-200f-4eca-8a12-dfca8511eb5c\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.771757 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-dns-svc\") pod \"934f7f22-200f-4eca-8a12-dfca8511eb5c\" (UID: 
\"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.771813 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vnkg\" (UniqueName: \"kubernetes.io/projected/934f7f22-200f-4eca-8a12-dfca8511eb5c-kube-api-access-2vnkg\") pod \"934f7f22-200f-4eca-8a12-dfca8511eb5c\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.771853 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-sb\") pod \"934f7f22-200f-4eca-8a12-dfca8511eb5c\" (UID: \"934f7f22-200f-4eca-8a12-dfca8511eb5c\") " Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.778176 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934f7f22-200f-4eca-8a12-dfca8511eb5c-kube-api-access-2vnkg" (OuterVolumeSpecName: "kube-api-access-2vnkg") pod "934f7f22-200f-4eca-8a12-dfca8511eb5c" (UID: "934f7f22-200f-4eca-8a12-dfca8511eb5c"). InnerVolumeSpecName "kube-api-access-2vnkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.833005 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "934f7f22-200f-4eca-8a12-dfca8511eb5c" (UID: "934f7f22-200f-4eca-8a12-dfca8511eb5c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.838527 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "934f7f22-200f-4eca-8a12-dfca8511eb5c" (UID: "934f7f22-200f-4eca-8a12-dfca8511eb5c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.841861 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "934f7f22-200f-4eca-8a12-dfca8511eb5c" (UID: "934f7f22-200f-4eca-8a12-dfca8511eb5c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.846022 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-config" (OuterVolumeSpecName: "config") pod "934f7f22-200f-4eca-8a12-dfca8511eb5c" (UID: "934f7f22-200f-4eca-8a12-dfca8511eb5c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.874803 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.874835 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.874846 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.874854 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vnkg\" (UniqueName: \"kubernetes.io/projected/934f7f22-200f-4eca-8a12-dfca8511eb5c-kube-api-access-2vnkg\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:26 crc kubenswrapper[4834]: I0130 21:36:26.874864 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/934f7f22-200f-4eca-8a12-dfca8511eb5c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.155291 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" event={"ID":"3321163d-92e3-4a02-8700-5d403f01f247","Type":"ContainerStarted","Data":"711184ec698d8d0bf2405d0db4475397b2b664ce9e5c476a30b3ce5b576ef5dc"} Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.155426 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.156898 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="934f7f22-200f-4eca-8a12-dfca8511eb5c" containerID="476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4" exitCode=0 Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.156954 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-gbj7z" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.156973 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.156989 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gbj7z" event={"ID":"934f7f22-200f-4eca-8a12-dfca8511eb5c","Type":"ContainerDied","Data":"476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4"} Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.157039 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-gbj7z" event={"ID":"934f7f22-200f-4eca-8a12-dfca8511eb5c","Type":"ContainerDied","Data":"65b2c7ff7ae725ddb6096af6c97cc36219e7d690482d0de906242640169c9bf6"} Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.157063 4834 scope.go:117] "RemoveContainer" containerID="476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.166297 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.182512 4834 scope.go:117] "RemoveContainer" containerID="9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.190303 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bv5d8"] Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.192347 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" podStartSLOduration=3.192328056 podStartE2EDuration="3.192328056s" podCreationTimestamp="2026-01-30 21:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:27.182678435 +0000 UTC m=+1238.335824573" watchObservedRunningTime="2026-01-30 21:36:27.192328056 +0000 UTC m=+1238.345474194" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.211435 4834 scope.go:117] "RemoveContainer" containerID="476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4" Jan 30 21:36:27 crc kubenswrapper[4834]: E0130 21:36:27.212910 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4\": container with ID starting with 476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4 not found: ID does not exist" containerID="476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.213031 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4"} err="failed to get container status \"476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4\": rpc error: code = NotFound desc = could 
not find container \"476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4\": container with ID starting with 476aefd8088832b99fee4041f027a0bceb0def17f3892aebf8d6008b23f15bf4 not found: ID does not exist" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.213058 4834 scope.go:117] "RemoveContainer" containerID="9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0" Jan 30 21:36:27 crc kubenswrapper[4834]: E0130 21:36:27.214039 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0\": container with ID starting with 9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0 not found: ID does not exist" containerID="9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.214087 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0"} err="failed to get container status \"9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0\": rpc error: code = NotFound desc = could not find container \"9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0\": container with ID starting with 9463b21de4de256694ee4a35bfcf26f21928d1a80d47fc41d445a890e50d90a0 not found: ID does not exist" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.221533 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gbj7z"] Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.227189 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-gbj7z"] Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.281506 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-dispersionconf\") pod \"a8e5580f-6518-4429-baee-cd9fd1597930\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.281566 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a8e5580f-6518-4429-baee-cd9fd1597930-etc-swift\") pod \"a8e5580f-6518-4429-baee-cd9fd1597930\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.281623 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-scripts\") pod \"a8e5580f-6518-4429-baee-cd9fd1597930\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.281764 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-ring-data-devices\") pod \"a8e5580f-6518-4429-baee-cd9fd1597930\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.281804 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-swiftconf\") pod \"a8e5580f-6518-4429-baee-cd9fd1597930\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.281841 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl9zn\" (UniqueName: \"kubernetes.io/projected/a8e5580f-6518-4429-baee-cd9fd1597930-kube-api-access-kl9zn\") pod \"a8e5580f-6518-4429-baee-cd9fd1597930\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.281868 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-combined-ca-bundle\") pod \"a8e5580f-6518-4429-baee-cd9fd1597930\" (UID: \"a8e5580f-6518-4429-baee-cd9fd1597930\") " Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.282036 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e5580f-6518-4429-baee-cd9fd1597930-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a8e5580f-6518-4429-baee-cd9fd1597930" (UID: "a8e5580f-6518-4429-baee-cd9fd1597930"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.282296 4834 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a8e5580f-6518-4429-baee-cd9fd1597930-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.282947 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-scripts" (OuterVolumeSpecName: "scripts") pod "a8e5580f-6518-4429-baee-cd9fd1597930" (UID: "a8e5580f-6518-4429-baee-cd9fd1597930"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.283511 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a8e5580f-6518-4429-baee-cd9fd1597930" (UID: "a8e5580f-6518-4429-baee-cd9fd1597930"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.286677 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a8e5580f-6518-4429-baee-cd9fd1597930" (UID: "a8e5580f-6518-4429-baee-cd9fd1597930"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.287256 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e5580f-6518-4429-baee-cd9fd1597930-kube-api-access-kl9zn" (OuterVolumeSpecName: "kube-api-access-kl9zn") pod "a8e5580f-6518-4429-baee-cd9fd1597930" (UID: "a8e5580f-6518-4429-baee-cd9fd1597930"). InnerVolumeSpecName "kube-api-access-kl9zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.287798 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8e5580f-6518-4429-baee-cd9fd1597930" (UID: "a8e5580f-6518-4429-baee-cd9fd1597930"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.288363 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a8e5580f-6518-4429-baee-cd9fd1597930" (UID: "a8e5580f-6518-4429-baee-cd9fd1597930"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.384592 4834 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.384666 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.384679 4834 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a8e5580f-6518-4429-baee-cd9fd1597930-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.384691 4834 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.384705 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl9zn\" (UniqueName: \"kubernetes.io/projected/a8e5580f-6518-4429-baee-cd9fd1597930-kube-api-access-kl9zn\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.384718 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8e5580f-6518-4429-baee-cd9fd1597930-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.485750 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " 
pod="openstack/swift-storage-0" Jan 30 21:36:27 crc kubenswrapper[4834]: E0130 21:36:27.486008 4834 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:36:27 crc kubenswrapper[4834]: E0130 21:36:27.486022 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:36:27 crc kubenswrapper[4834]: E0130 21:36:27.486062 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift podName:070baa9f-0897-4fe2-bc14-68a831d81dce nodeName:}" failed. No retries permitted until 2026-01-30 21:36:29.48604935 +0000 UTC m=+1240.639195488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift") pod "swift-storage-0" (UID: "070baa9f-0897-4fe2-bc14-68a831d81dce") : configmap "swift-ring-files" not found Jan 30 21:36:27 crc kubenswrapper[4834]: I0130 21:36:27.550106 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934f7f22-200f-4eca-8a12-dfca8511eb5c" path="/var/lib/kubelet/pods/934f7f22-200f-4eca-8a12-dfca8511eb5c/volumes" Jan 30 21:36:28 crc kubenswrapper[4834]: I0130 21:36:28.171017 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bv5d8" event={"ID":"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb","Type":"ContainerStarted","Data":"85a43e85094511c47230af8608e5b710a281702f1dc46bbac46fa3a3f041f4c4"} Jan 30 21:36:28 crc kubenswrapper[4834]: I0130 21:36:28.174620 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zxsg7" Jan 30 21:36:28 crc kubenswrapper[4834]: I0130 21:36:28.224524 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-zxsg7"] Jan 30 21:36:28 crc kubenswrapper[4834]: I0130 21:36:28.232557 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-zxsg7"] Jan 30 21:36:29 crc kubenswrapper[4834]: I0130 21:36:29.341835 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 30 21:36:29 crc kubenswrapper[4834]: I0130 21:36:29.443187 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 30 21:36:29 crc kubenswrapper[4834]: I0130 21:36:29.531061 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:29 crc kubenswrapper[4834]: E0130 21:36:29.532242 4834 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:36:29 crc kubenswrapper[4834]: E0130 21:36:29.532590 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:36:29 crc kubenswrapper[4834]: E0130 21:36:29.533654 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift podName:070baa9f-0897-4fe2-bc14-68a831d81dce nodeName:}" failed. No retries permitted until 2026-01-30 21:36:33.533625767 +0000 UTC m=+1244.686771905 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift") pod "swift-storage-0" (UID: "070baa9f-0897-4fe2-bc14-68a831d81dce") : configmap "swift-ring-files" not found Jan 30 21:36:29 crc kubenswrapper[4834]: I0130 21:36:29.551182 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e5580f-6518-4429-baee-cd9fd1597930" path="/var/lib/kubelet/pods/a8e5580f-6518-4429-baee-cd9fd1597930/volumes" Jan 30 21:36:30 crc kubenswrapper[4834]: I0130 21:36:30.994310 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 30 21:36:30 crc kubenswrapper[4834]: I0130 21:36:30.995855 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.080019 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.136318 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qnlg7"] Jan 30 21:36:31 crc kubenswrapper[4834]: E0130 21:36:31.136752 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934f7f22-200f-4eca-8a12-dfca8511eb5c" containerName="init" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.136765 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="934f7f22-200f-4eca-8a12-dfca8511eb5c" containerName="init" Jan 30 21:36:31 crc kubenswrapper[4834]: E0130 21:36:31.136779 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="934f7f22-200f-4eca-8a12-dfca8511eb5c" containerName="dnsmasq-dns" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.136786 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="934f7f22-200f-4eca-8a12-dfca8511eb5c" containerName="dnsmasq-dns" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 
21:36:31.136947 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="934f7f22-200f-4eca-8a12-dfca8511eb5c" containerName="dnsmasq-dns" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.137539 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qnlg7" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.142836 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qnlg7"] Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.147318 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.202906 4834 generic.go:334] "Generic (PLEG): container finished" podID="bd8c97eb-154c-451c-88ec-025f6148936c" containerID="b5f40a5302db9286d8889928391704f4344288840c59da33e134a6c141d1f7cc" exitCode=0 Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.203079 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bd8c97eb-154c-451c-88ec-025f6148936c","Type":"ContainerDied","Data":"b5f40a5302db9286d8889928391704f4344288840c59da33e134a6c141d1f7cc"} Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.206819 4834 generic.go:334] "Generic (PLEG): container finished" podID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerID="298b0d54929de73c138aeb7f470a15fc04c1aa6fa18b6a5212045046eec7737f" exitCode=0 Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.206946 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"500f2414-6837-49ac-b834-06b5fd86d2b8","Type":"ContainerDied","Data":"298b0d54929de73c138aeb7f470a15fc04c1aa6fa18b6a5212045046eec7737f"} Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.272362 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/651f733b-17b1-4837-93d5-f206e004062a-operator-scripts\") pod \"root-account-create-update-qnlg7\" (UID: \"651f733b-17b1-4837-93d5-f206e004062a\") " pod="openstack/root-account-create-update-qnlg7" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.272521 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kpnk\" (UniqueName: \"kubernetes.io/projected/651f733b-17b1-4837-93d5-f206e004062a-kube-api-access-4kpnk\") pod \"root-account-create-update-qnlg7\" (UID: \"651f733b-17b1-4837-93d5-f206e004062a\") " pod="openstack/root-account-create-update-qnlg7" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.299762 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.373612 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kpnk\" (UniqueName: \"kubernetes.io/projected/651f733b-17b1-4837-93d5-f206e004062a-kube-api-access-4kpnk\") pod \"root-account-create-update-qnlg7\" (UID: \"651f733b-17b1-4837-93d5-f206e004062a\") " pod="openstack/root-account-create-update-qnlg7" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.373792 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651f733b-17b1-4837-93d5-f206e004062a-operator-scripts\") pod \"root-account-create-update-qnlg7\" (UID: \"651f733b-17b1-4837-93d5-f206e004062a\") " pod="openstack/root-account-create-update-qnlg7" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.375125 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651f733b-17b1-4837-93d5-f206e004062a-operator-scripts\") pod \"root-account-create-update-qnlg7\" (UID: \"651f733b-17b1-4837-93d5-f206e004062a\") " 
pod="openstack/root-account-create-update-qnlg7" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.393997 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kpnk\" (UniqueName: \"kubernetes.io/projected/651f733b-17b1-4837-93d5-f206e004062a-kube-api-access-4kpnk\") pod \"root-account-create-update-qnlg7\" (UID: \"651f733b-17b1-4837-93d5-f206e004062a\") " pod="openstack/root-account-create-update-qnlg7" Jan 30 21:36:31 crc kubenswrapper[4834]: I0130 21:36:31.474736 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qnlg7" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.223736 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bv5d8" event={"ID":"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb","Type":"ContainerStarted","Data":"9a7bd76ab6a9c1be62662d3c6f473798ca2a952af365a74be3cd654948a354cf"} Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.227383 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"500f2414-6837-49ac-b834-06b5fd86d2b8","Type":"ContainerStarted","Data":"a2f57942c10e77446bfc01dc91bd3dc17789a4321895b0d223014e4622a2f767"} Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.227592 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.232341 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bd8c97eb-154c-451c-88ec-025f6148936c","Type":"ContainerStarted","Data":"1dfc18f738a42fea16bef16f66c063db2e73e7bc2d4fcf8e74f75bdef09ebcc8"} Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.233167 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.261246 4834 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/keystone-db-create-tq9jt"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.262609 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tq9jt" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.286002 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-bv5d8" podStartSLOduration=1.56286182 podStartE2EDuration="6.28598142s" podCreationTimestamp="2026-01-30 21:36:26 +0000 UTC" firstStartedPulling="2026-01-30 21:36:27.211347501 +0000 UTC m=+1238.364493639" lastFinishedPulling="2026-01-30 21:36:31.934467101 +0000 UTC m=+1243.087613239" observedRunningTime="2026-01-30 21:36:32.257460758 +0000 UTC m=+1243.410606916" watchObservedRunningTime="2026-01-30 21:36:32.28598142 +0000 UTC m=+1243.439127578" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.299728 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tq9jt"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.319508 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.699416576 podStartE2EDuration="54.319489542s" podCreationTimestamp="2026-01-30 21:35:38 +0000 UTC" firstStartedPulling="2026-01-30 21:35:39.99716276 +0000 UTC m=+1191.150308898" lastFinishedPulling="2026-01-30 21:35:57.617235726 +0000 UTC m=+1208.770381864" observedRunningTime="2026-01-30 21:36:32.313843273 +0000 UTC m=+1243.466989421" watchObservedRunningTime="2026-01-30 21:36:32.319489542 +0000 UTC m=+1243.472635680" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.342072 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qnlg7"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.358343 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=37.160258398 podStartE2EDuration="54.358327213s" podCreationTimestamp="2026-01-30 21:35:38 +0000 UTC" firstStartedPulling="2026-01-30 21:35:40.372306684 +0000 UTC m=+1191.525452822" lastFinishedPulling="2026-01-30 21:35:57.570375479 +0000 UTC m=+1208.723521637" observedRunningTime="2026-01-30 21:36:32.35499849 +0000 UTC m=+1243.508144628" watchObservedRunningTime="2026-01-30 21:36:32.358327213 +0000 UTC m=+1243.511473351" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.391277 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nll5j\" (UniqueName: \"kubernetes.io/projected/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-kube-api-access-nll5j\") pod \"keystone-db-create-tq9jt\" (UID: \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\") " pod="openstack/keystone-db-create-tq9jt" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.392909 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-operator-scripts\") pod \"keystone-db-create-tq9jt\" (UID: \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\") " pod="openstack/keystone-db-create-tq9jt" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.425262 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2b27-account-create-update-xn4jd"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.426226 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2b27-account-create-update-xn4jd" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.428076 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.439430 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2b27-account-create-update-xn4jd"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.495290 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b92e53-1808-4b80-9b1d-413bbb97933d-operator-scripts\") pod \"keystone-2b27-account-create-update-xn4jd\" (UID: \"a6b92e53-1808-4b80-9b1d-413bbb97933d\") " pod="openstack/keystone-2b27-account-create-update-xn4jd" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.495334 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvh9q\" (UniqueName: \"kubernetes.io/projected/a6b92e53-1808-4b80-9b1d-413bbb97933d-kube-api-access-fvh9q\") pod \"keystone-2b27-account-create-update-xn4jd\" (UID: \"a6b92e53-1808-4b80-9b1d-413bbb97933d\") " pod="openstack/keystone-2b27-account-create-update-xn4jd" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.495411 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-operator-scripts\") pod \"keystone-db-create-tq9jt\" (UID: \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\") " pod="openstack/keystone-db-create-tq9jt" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.495512 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nll5j\" (UniqueName: \"kubernetes.io/projected/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-kube-api-access-nll5j\") pod \"keystone-db-create-tq9jt\" 
(UID: \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\") " pod="openstack/keystone-db-create-tq9jt" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.496505 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-operator-scripts\") pod \"keystone-db-create-tq9jt\" (UID: \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\") " pod="openstack/keystone-db-create-tq9jt" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.525508 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nll5j\" (UniqueName: \"kubernetes.io/projected/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-kube-api-access-nll5j\") pod \"keystone-db-create-tq9jt\" (UID: \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\") " pod="openstack/keystone-db-create-tq9jt" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.583346 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tq9jt" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.597565 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b92e53-1808-4b80-9b1d-413bbb97933d-operator-scripts\") pod \"keystone-2b27-account-create-update-xn4jd\" (UID: \"a6b92e53-1808-4b80-9b1d-413bbb97933d\") " pod="openstack/keystone-2b27-account-create-update-xn4jd" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.597604 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvh9q\" (UniqueName: \"kubernetes.io/projected/a6b92e53-1808-4b80-9b1d-413bbb97933d-kube-api-access-fvh9q\") pod \"keystone-2b27-account-create-update-xn4jd\" (UID: \"a6b92e53-1808-4b80-9b1d-413bbb97933d\") " pod="openstack/keystone-2b27-account-create-update-xn4jd" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.598373 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b92e53-1808-4b80-9b1d-413bbb97933d-operator-scripts\") pod \"keystone-2b27-account-create-update-xn4jd\" (UID: \"a6b92e53-1808-4b80-9b1d-413bbb97933d\") " pod="openstack/keystone-2b27-account-create-update-xn4jd" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.625521 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5c0a-account-create-update-qgtjv"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.632526 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c0a-account-create-update-qgtjv" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.648681 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvh9q\" (UniqueName: \"kubernetes.io/projected/a6b92e53-1808-4b80-9b1d-413bbb97933d-kube-api-access-fvh9q\") pod \"keystone-2b27-account-create-update-xn4jd\" (UID: \"a6b92e53-1808-4b80-9b1d-413bbb97933d\") " pod="openstack/keystone-2b27-account-create-update-xn4jd" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.650938 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.657616 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xgmcw"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.660235 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xgmcw" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.669181 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c0a-account-create-update-qgtjv"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.686246 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xgmcw"] Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.699194 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-operator-scripts\") pod \"placement-5c0a-account-create-update-qgtjv\" (UID: \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\") " pod="openstack/placement-5c0a-account-create-update-qgtjv" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.699285 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqbc\" (UniqueName: \"kubernetes.io/projected/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-kube-api-access-qcqbc\") pod \"placement-5c0a-account-create-update-qgtjv\" (UID: \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\") " pod="openstack/placement-5c0a-account-create-update-qgtjv" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.788983 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2b27-account-create-update-xn4jd" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.800908 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-operator-scripts\") pod \"placement-5c0a-account-create-update-qgtjv\" (UID: \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\") " pod="openstack/placement-5c0a-account-create-update-qgtjv" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.800981 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj58n\" (UniqueName: \"kubernetes.io/projected/39b1b391-8f59-474c-9842-adcfedf5cee3-kube-api-access-dj58n\") pod \"placement-db-create-xgmcw\" (UID: \"39b1b391-8f59-474c-9842-adcfedf5cee3\") " pod="openstack/placement-db-create-xgmcw" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.801014 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39b1b391-8f59-474c-9842-adcfedf5cee3-operator-scripts\") pod \"placement-db-create-xgmcw\" (UID: \"39b1b391-8f59-474c-9842-adcfedf5cee3\") " pod="openstack/placement-db-create-xgmcw" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.801052 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqbc\" (UniqueName: \"kubernetes.io/projected/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-kube-api-access-qcqbc\") pod \"placement-5c0a-account-create-update-qgtjv\" (UID: \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\") " pod="openstack/placement-5c0a-account-create-update-qgtjv" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.802199 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-operator-scripts\") pod \"placement-5c0a-account-create-update-qgtjv\" (UID: \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\") " pod="openstack/placement-5c0a-account-create-update-qgtjv" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.822193 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqbc\" (UniqueName: \"kubernetes.io/projected/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-kube-api-access-qcqbc\") pod \"placement-5c0a-account-create-update-qgtjv\" (UID: \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\") " pod="openstack/placement-5c0a-account-create-update-qgtjv" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.903123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj58n\" (UniqueName: \"kubernetes.io/projected/39b1b391-8f59-474c-9842-adcfedf5cee3-kube-api-access-dj58n\") pod \"placement-db-create-xgmcw\" (UID: \"39b1b391-8f59-474c-9842-adcfedf5cee3\") " pod="openstack/placement-db-create-xgmcw" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.903614 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39b1b391-8f59-474c-9842-adcfedf5cee3-operator-scripts\") pod \"placement-db-create-xgmcw\" (UID: \"39b1b391-8f59-474c-9842-adcfedf5cee3\") " pod="openstack/placement-db-create-xgmcw" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.904839 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39b1b391-8f59-474c-9842-adcfedf5cee3-operator-scripts\") pod \"placement-db-create-xgmcw\" (UID: \"39b1b391-8f59-474c-9842-adcfedf5cee3\") " pod="openstack/placement-db-create-xgmcw" Jan 30 21:36:32 crc kubenswrapper[4834]: I0130 21:36:32.938809 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj58n\" 
(UniqueName: \"kubernetes.io/projected/39b1b391-8f59-474c-9842-adcfedf5cee3-kube-api-access-dj58n\") pod \"placement-db-create-xgmcw\" (UID: \"39b1b391-8f59-474c-9842-adcfedf5cee3\") " pod="openstack/placement-db-create-xgmcw" Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.019491 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c0a-account-create-update-qgtjv" Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.030235 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xgmcw" Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.152710 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tq9jt"] Jan 30 21:36:33 crc kubenswrapper[4834]: W0130 21:36:33.182861 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd60c70e9_9d5e_42f4_ae4f_2dfe0549ec5d.slice/crio-ab0048717b7e92bb17e8bbec89fd0fd6ca706f4f11cbfc8134f7c3e5456e972d WatchSource:0}: Error finding container ab0048717b7e92bb17e8bbec89fd0fd6ca706f4f11cbfc8134f7c3e5456e972d: Status 404 returned error can't find the container with id ab0048717b7e92bb17e8bbec89fd0fd6ca706f4f11cbfc8134f7c3e5456e972d Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.268539 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tq9jt" event={"ID":"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d","Type":"ContainerStarted","Data":"ab0048717b7e92bb17e8bbec89fd0fd6ca706f4f11cbfc8134f7c3e5456e972d"} Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.276509 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qnlg7" event={"ID":"651f733b-17b1-4837-93d5-f206e004062a","Type":"ContainerStarted","Data":"8b6485eff1d46030a91f054fd46ed3228c039d232ba76177611ab9294bb8f54b"} Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.276780 
4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qnlg7" event={"ID":"651f733b-17b1-4837-93d5-f206e004062a","Type":"ContainerStarted","Data":"a25efdbf9c648f9f045251af6c82f5a7e8c6ab6320a2dab79d9ff978b031126c"} Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.317636 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-qnlg7" podStartSLOduration=2.317614554 podStartE2EDuration="2.317614554s" podCreationTimestamp="2026-01-30 21:36:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:33.296363557 +0000 UTC m=+1244.449509705" watchObservedRunningTime="2026-01-30 21:36:33.317614554 +0000 UTC m=+1244.470760692" Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.391058 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2b27-account-create-update-xn4jd"] Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.632835 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:33 crc kubenswrapper[4834]: E0130 21:36:33.633053 4834 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 30 21:36:33 crc kubenswrapper[4834]: E0130 21:36:33.633076 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 30 21:36:33 crc kubenswrapper[4834]: E0130 21:36:33.633151 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift podName:070baa9f-0897-4fe2-bc14-68a831d81dce 
nodeName:}" failed. No retries permitted until 2026-01-30 21:36:41.633132621 +0000 UTC m=+1252.786278759 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift") pod "swift-storage-0" (UID: "070baa9f-0897-4fe2-bc14-68a831d81dce") : configmap "swift-ring-files" not found Jan 30 21:36:33 crc kubenswrapper[4834]: W0130 21:36:33.687380 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32c5a9d6_cb42_4851_b02d_9e8664c51bf4.slice/crio-e0bcb6af47a02d0c23894e424f7ca71efde450ce5f62f2f7cfd47a74f974cee2 WatchSource:0}: Error finding container e0bcb6af47a02d0c23894e424f7ca71efde450ce5f62f2f7cfd47a74f974cee2: Status 404 returned error can't find the container with id e0bcb6af47a02d0c23894e424f7ca71efde450ce5f62f2f7cfd47a74f974cee2 Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.695964 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5c0a-account-create-update-qgtjv"] Jan 30 21:36:33 crc kubenswrapper[4834]: I0130 21:36:33.825292 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xgmcw"] Jan 30 21:36:33 crc kubenswrapper[4834]: W0130 21:36:33.827247 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39b1b391_8f59_474c_9842_adcfedf5cee3.slice/crio-1961f249004cd57a92957e8d2393bd2c8b534e54af26abeaae2a94cd2cf62780 WatchSource:0}: Error finding container 1961f249004cd57a92957e8d2393bd2c8b534e54af26abeaae2a94cd2cf62780: Status 404 returned error can't find the container with id 1961f249004cd57a92957e8d2393bd2c8b534e54af26abeaae2a94cd2cf62780 Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.285148 4834 generic.go:334] "Generic (PLEG): container finished" podID="651f733b-17b1-4837-93d5-f206e004062a" 
containerID="8b6485eff1d46030a91f054fd46ed3228c039d232ba76177611ab9294bb8f54b" exitCode=0 Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.285220 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qnlg7" event={"ID":"651f733b-17b1-4837-93d5-f206e004062a","Type":"ContainerDied","Data":"8b6485eff1d46030a91f054fd46ed3228c039d232ba76177611ab9294bb8f54b"} Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.287014 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xgmcw" event={"ID":"39b1b391-8f59-474c-9842-adcfedf5cee3","Type":"ContainerStarted","Data":"dd9046c1575a0c00f39dc89856f4f2885d66221155294a2de23d47b858b8d27a"} Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.287049 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xgmcw" event={"ID":"39b1b391-8f59-474c-9842-adcfedf5cee3","Type":"ContainerStarted","Data":"1961f249004cd57a92957e8d2393bd2c8b534e54af26abeaae2a94cd2cf62780"} Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.289474 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c0a-account-create-update-qgtjv" event={"ID":"32c5a9d6-cb42-4851-b02d-9e8664c51bf4","Type":"ContainerStarted","Data":"82263ce65f875c23e68e26efe3d10bb03d87f2c848a17824194d30cca02217e2"} Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.289527 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c0a-account-create-update-qgtjv" event={"ID":"32c5a9d6-cb42-4851-b02d-9e8664c51bf4","Type":"ContainerStarted","Data":"e0bcb6af47a02d0c23894e424f7ca71efde450ce5f62f2f7cfd47a74f974cee2"} Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.290986 4834 generic.go:334] "Generic (PLEG): container finished" podID="d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d" containerID="f475e99c2fbcc941b45c3a49f489cccc1546c7a1922802c751012d73f2226352" exitCode=0 Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 
21:36:34.291068 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tq9jt" event={"ID":"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d","Type":"ContainerDied","Data":"f475e99c2fbcc941b45c3a49f489cccc1546c7a1922802c751012d73f2226352"} Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.294330 4834 generic.go:334] "Generic (PLEG): container finished" podID="a6b92e53-1808-4b80-9b1d-413bbb97933d" containerID="c7c583795bb10848d52ffc70341935de3c4d68870b1e18ad5998254ca101f43d" exitCode=0 Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.294372 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2b27-account-create-update-xn4jd" event={"ID":"a6b92e53-1808-4b80-9b1d-413bbb97933d","Type":"ContainerDied","Data":"c7c583795bb10848d52ffc70341935de3c4d68870b1e18ad5998254ca101f43d"} Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.294577 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2b27-account-create-update-xn4jd" event={"ID":"a6b92e53-1808-4b80-9b1d-413bbb97933d","Type":"ContainerStarted","Data":"56e4f8e2183a7685bffe0175af80adddba533e0bf90ac8964da368be05533f1a"} Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.355517 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5c0a-account-create-update-qgtjv" podStartSLOduration=2.355499104 podStartE2EDuration="2.355499104s" podCreationTimestamp="2026-01-30 21:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:34.350813732 +0000 UTC m=+1245.503959870" watchObservedRunningTime="2026-01-30 21:36:34.355499104 +0000 UTC m=+1245.508645252" Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.404144 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-xgmcw" podStartSLOduration=2.40412378 podStartE2EDuration="2.40412378s" 
podCreationTimestamp="2026-01-30 21:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:34.385952879 +0000 UTC m=+1245.539099027" watchObservedRunningTime="2026-01-30 21:36:34.40412378 +0000 UTC m=+1245.557269918" Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.928514 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:36:34 crc kubenswrapper[4834]: I0130 21:36:34.999175 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ftq2c"] Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.000143 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" podUID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" containerName="dnsmasq-dns" containerID="cri-o://c2d9592aade1d345238f6fcf85da09172b99d3c4c632ae835ea8a0d0009d742b" gracePeriod=10 Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.323222 4834 generic.go:334] "Generic (PLEG): container finished" podID="32c5a9d6-cb42-4851-b02d-9e8664c51bf4" containerID="82263ce65f875c23e68e26efe3d10bb03d87f2c848a17824194d30cca02217e2" exitCode=0 Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.323280 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c0a-account-create-update-qgtjv" event={"ID":"32c5a9d6-cb42-4851-b02d-9e8664c51bf4","Type":"ContainerDied","Data":"82263ce65f875c23e68e26efe3d10bb03d87f2c848a17824194d30cca02217e2"} Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.324905 4834 generic.go:334] "Generic (PLEG): container finished" podID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" containerID="c2d9592aade1d345238f6fcf85da09172b99d3c4c632ae835ea8a0d0009d742b" exitCode=0 Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.324946 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" event={"ID":"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad","Type":"ContainerDied","Data":"c2d9592aade1d345238f6fcf85da09172b99d3c4c632ae835ea8a0d0009d742b"} Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.328857 4834 generic.go:334] "Generic (PLEG): container finished" podID="39b1b391-8f59-474c-9842-adcfedf5cee3" containerID="dd9046c1575a0c00f39dc89856f4f2885d66221155294a2de23d47b858b8d27a" exitCode=0 Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.329019 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xgmcw" event={"ID":"39b1b391-8f59-474c-9842-adcfedf5cee3","Type":"ContainerDied","Data":"dd9046c1575a0c00f39dc89856f4f2885d66221155294a2de23d47b858b8d27a"} Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.613029 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.668791 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxscg\" (UniqueName: \"kubernetes.io/projected/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-kube-api-access-cxscg\") pod \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.668870 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-dns-svc\") pod \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.668960 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-config\") pod \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\" (UID: \"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad\") " Jan 
30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.692449 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-kube-api-access-cxscg" (OuterVolumeSpecName: "kube-api-access-cxscg") pod "58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" (UID: "58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad"). InnerVolumeSpecName "kube-api-access-cxscg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.747989 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" (UID: "58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.751030 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-config" (OuterVolumeSpecName: "config") pod "58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" (UID: "58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.771758 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.771804 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxscg\" (UniqueName: \"kubernetes.io/projected/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-kube-api-access-cxscg\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.771820 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.804697 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tq9jt"
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.873746 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nll5j\" (UniqueName: \"kubernetes.io/projected/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-kube-api-access-nll5j\") pod \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\" (UID: \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\") "
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.874043 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-operator-scripts\") pod \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\" (UID: \"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d\") "
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.874735 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d" (UID: "d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.874907 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qnlg7"
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.882564 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-kube-api-access-nll5j" (OuterVolumeSpecName: "kube-api-access-nll5j") pod "d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d" (UID: "d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d"). InnerVolumeSpecName "kube-api-access-nll5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.977897 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kpnk\" (UniqueName: \"kubernetes.io/projected/651f733b-17b1-4837-93d5-f206e004062a-kube-api-access-4kpnk\") pod \"651f733b-17b1-4837-93d5-f206e004062a\" (UID: \"651f733b-17b1-4837-93d5-f206e004062a\") "
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.978095 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651f733b-17b1-4837-93d5-f206e004062a-operator-scripts\") pod \"651f733b-17b1-4837-93d5-f206e004062a\" (UID: \"651f733b-17b1-4837-93d5-f206e004062a\") "
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.978610 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.978627 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nll5j\" (UniqueName: \"kubernetes.io/projected/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d-kube-api-access-nll5j\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.978962 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651f733b-17b1-4837-93d5-f206e004062a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "651f733b-17b1-4837-93d5-f206e004062a" (UID: "651f733b-17b1-4837-93d5-f206e004062a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:36:35 crc kubenswrapper[4834]: I0130 21:36:35.984643 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651f733b-17b1-4837-93d5-f206e004062a-kube-api-access-4kpnk" (OuterVolumeSpecName: "kube-api-access-4kpnk") pod "651f733b-17b1-4837-93d5-f206e004062a" (UID: "651f733b-17b1-4837-93d5-f206e004062a"). InnerVolumeSpecName "kube-api-access-4kpnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.063976 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2b27-account-create-update-xn4jd"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.080134 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvh9q\" (UniqueName: \"kubernetes.io/projected/a6b92e53-1808-4b80-9b1d-413bbb97933d-kube-api-access-fvh9q\") pod \"a6b92e53-1808-4b80-9b1d-413bbb97933d\" (UID: \"a6b92e53-1808-4b80-9b1d-413bbb97933d\") "
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.080368 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b92e53-1808-4b80-9b1d-413bbb97933d-operator-scripts\") pod \"a6b92e53-1808-4b80-9b1d-413bbb97933d\" (UID: \"a6b92e53-1808-4b80-9b1d-413bbb97933d\") "
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.080720 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b92e53-1808-4b80-9b1d-413bbb97933d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6b92e53-1808-4b80-9b1d-413bbb97933d" (UID: "a6b92e53-1808-4b80-9b1d-413bbb97933d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.080813 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651f733b-17b1-4837-93d5-f206e004062a-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.080829 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6b92e53-1808-4b80-9b1d-413bbb97933d-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.080838 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kpnk\" (UniqueName: \"kubernetes.io/projected/651f733b-17b1-4837-93d5-f206e004062a-kube-api-access-4kpnk\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.082810 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b92e53-1808-4b80-9b1d-413bbb97933d-kube-api-access-fvh9q" (OuterVolumeSpecName: "kube-api-access-fvh9q") pod "a6b92e53-1808-4b80-9b1d-413bbb97933d" (UID: "a6b92e53-1808-4b80-9b1d-413bbb97933d"). InnerVolumeSpecName "kube-api-access-fvh9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.182200 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvh9q\" (UniqueName: \"kubernetes.io/projected/a6b92e53-1808-4b80-9b1d-413bbb97933d-kube-api-access-fvh9q\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.337551 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tq9jt" event={"ID":"d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d","Type":"ContainerDied","Data":"ab0048717b7e92bb17e8bbec89fd0fd6ca706f4f11cbfc8134f7c3e5456e972d"}
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.337603 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab0048717b7e92bb17e8bbec89fd0fd6ca706f4f11cbfc8134f7c3e5456e972d"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.337578 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tq9jt"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.339514 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2b27-account-create-update-xn4jd" event={"ID":"a6b92e53-1808-4b80-9b1d-413bbb97933d","Type":"ContainerDied","Data":"56e4f8e2183a7685bffe0175af80adddba533e0bf90ac8964da368be05533f1a"}
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.339555 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56e4f8e2183a7685bffe0175af80adddba533e0bf90ac8964da368be05533f1a"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.339563 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2b27-account-create-update-xn4jd"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.341615 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c" event={"ID":"58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad","Type":"ContainerDied","Data":"ad17df70671c9f68b40fb4820143098ea225552c29d5b923941235eb7aa5d56f"}
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.341653 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ftq2c"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.341662 4834 scope.go:117] "RemoveContainer" containerID="c2d9592aade1d345238f6fcf85da09172b99d3c4c632ae835ea8a0d0009d742b"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.343535 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qnlg7"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.343565 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qnlg7" event={"ID":"651f733b-17b1-4837-93d5-f206e004062a","Type":"ContainerDied","Data":"a25efdbf9c648f9f045251af6c82f5a7e8c6ab6320a2dab79d9ff978b031126c"}
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.343581 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a25efdbf9c648f9f045251af6c82f5a7e8c6ab6320a2dab79d9ff978b031126c"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.368247 4834 scope.go:117] "RemoveContainer" containerID="3c39b55fbf0eea3af69632e44d754a0c579796a31ec2924ec52cf0165ab3875e"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.383936 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ftq2c"]
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.391945 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ftq2c"]
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.771888 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xgmcw"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.795806 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39b1b391-8f59-474c-9842-adcfedf5cee3-operator-scripts\") pod \"39b1b391-8f59-474c-9842-adcfedf5cee3\" (UID: \"39b1b391-8f59-474c-9842-adcfedf5cee3\") "
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.796126 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj58n\" (UniqueName: \"kubernetes.io/projected/39b1b391-8f59-474c-9842-adcfedf5cee3-kube-api-access-dj58n\") pod \"39b1b391-8f59-474c-9842-adcfedf5cee3\" (UID: \"39b1b391-8f59-474c-9842-adcfedf5cee3\") "
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.796836 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b1b391-8f59-474c-9842-adcfedf5cee3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39b1b391-8f59-474c-9842-adcfedf5cee3" (UID: "39b1b391-8f59-474c-9842-adcfedf5cee3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.801510 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b1b391-8f59-474c-9842-adcfedf5cee3-kube-api-access-dj58n" (OuterVolumeSpecName: "kube-api-access-dj58n") pod "39b1b391-8f59-474c-9842-adcfedf5cee3" (UID: "39b1b391-8f59-474c-9842-adcfedf5cee3"). InnerVolumeSpecName "kube-api-access-dj58n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.878139 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c0a-account-create-update-qgtjv"
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.898773 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39b1b391-8f59-474c-9842-adcfedf5cee3-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.898804 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj58n\" (UniqueName: \"kubernetes.io/projected/39b1b391-8f59-474c-9842-adcfedf5cee3-kube-api-access-dj58n\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:36 crc kubenswrapper[4834]: I0130 21:36:36.999910 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-operator-scripts\") pod \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\" (UID: \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\") "
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:36.999987 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcqbc\" (UniqueName: \"kubernetes.io/projected/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-kube-api-access-qcqbc\") pod \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\" (UID: \"32c5a9d6-cb42-4851-b02d-9e8664c51bf4\") "
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.000697 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32c5a9d6-cb42-4851-b02d-9e8664c51bf4" (UID: "32c5a9d6-cb42-4851-b02d-9e8664c51bf4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.002886 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-kube-api-access-qcqbc" (OuterVolumeSpecName: "kube-api-access-qcqbc") pod "32c5a9d6-cb42-4851-b02d-9e8664c51bf4" (UID: "32c5a9d6-cb42-4851-b02d-9e8664c51bf4"). InnerVolumeSpecName "kube-api-access-qcqbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.101260 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.101297 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcqbc\" (UniqueName: \"kubernetes.io/projected/32c5a9d6-cb42-4851-b02d-9e8664c51bf4-kube-api-access-qcqbc\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.356193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xgmcw" event={"ID":"39b1b391-8f59-474c-9842-adcfedf5cee3","Type":"ContainerDied","Data":"1961f249004cd57a92957e8d2393bd2c8b534e54af26abeaae2a94cd2cf62780"}
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.356238 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1961f249004cd57a92957e8d2393bd2c8b534e54af26abeaae2a94cd2cf62780"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.356207 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xgmcw"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.358131 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5c0a-account-create-update-qgtjv" event={"ID":"32c5a9d6-cb42-4851-b02d-9e8664c51bf4","Type":"ContainerDied","Data":"e0bcb6af47a02d0c23894e424f7ca71efde450ce5f62f2f7cfd47a74f974cee2"}
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.358171 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0bcb6af47a02d0c23894e424f7ca71efde450ce5f62f2f7cfd47a74f974cee2"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.358169 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5c0a-account-create-update-qgtjv"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.553430 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" path="/var/lib/kubelet/pods/58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad/volumes"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.890718 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-6g4tw"]
Jan 30 21:36:37 crc kubenswrapper[4834]: E0130 21:36:37.891095 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" containerName="init"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891115 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" containerName="init"
Jan 30 21:36:37 crc kubenswrapper[4834]: E0130 21:36:37.891135 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d" containerName="mariadb-database-create"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891144 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d" containerName="mariadb-database-create"
Jan 30 21:36:37 crc kubenswrapper[4834]: E0130 21:36:37.891165 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b1b391-8f59-474c-9842-adcfedf5cee3" containerName="mariadb-database-create"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891174 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b1b391-8f59-474c-9842-adcfedf5cee3" containerName="mariadb-database-create"
Jan 30 21:36:37 crc kubenswrapper[4834]: E0130 21:36:37.891195 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" containerName="dnsmasq-dns"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891203 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" containerName="dnsmasq-dns"
Jan 30 21:36:37 crc kubenswrapper[4834]: E0130 21:36:37.891212 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b92e53-1808-4b80-9b1d-413bbb97933d" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891220 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b92e53-1808-4b80-9b1d-413bbb97933d" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: E0130 21:36:37.891238 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651f733b-17b1-4837-93d5-f206e004062a" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891246 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="651f733b-17b1-4837-93d5-f206e004062a" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: E0130 21:36:37.891260 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c5a9d6-cb42-4851-b02d-9e8664c51bf4" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891268 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c5a9d6-cb42-4851-b02d-9e8664c51bf4" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891466 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="651f733b-17b1-4837-93d5-f206e004062a" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891481 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="58dc8ec9-b8e2-4039-b2f3-f30c2706e3ad" containerName="dnsmasq-dns"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891488 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b1b391-8f59-474c-9842-adcfedf5cee3" containerName="mariadb-database-create"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891500 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c5a9d6-cb42-4851-b02d-9e8664c51bf4" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891518 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b92e53-1808-4b80-9b1d-413bbb97933d" containerName="mariadb-account-create-update"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.891537 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d" containerName="mariadb-database-create"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.892203 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.902643 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6g4tw"]
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.915537 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dndbx\" (UniqueName: \"kubernetes.io/projected/c984ce30-d436-4965-83dd-3d07ec99edde-kube-api-access-dndbx\") pod \"glance-db-create-6g4tw\" (UID: \"c984ce30-d436-4965-83dd-3d07ec99edde\") " pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.916019 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c984ce30-d436-4965-83dd-3d07ec99edde-operator-scripts\") pod \"glance-db-create-6g4tw\" (UID: \"c984ce30-d436-4965-83dd-3d07ec99edde\") " pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.993113 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-747b-account-create-update-g5s5w"]
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.994432 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:37 crc kubenswrapper[4834]: I0130 21:36:37.997467 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.017585 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks69c\" (UniqueName: \"kubernetes.io/projected/1e0d2ef6-0a27-40ca-89d8-6508620ef387-kube-api-access-ks69c\") pod \"glance-747b-account-create-update-g5s5w\" (UID: \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\") " pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.017667 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c984ce30-d436-4965-83dd-3d07ec99edde-operator-scripts\") pod \"glance-db-create-6g4tw\" (UID: \"c984ce30-d436-4965-83dd-3d07ec99edde\") " pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.017822 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dndbx\" (UniqueName: \"kubernetes.io/projected/c984ce30-d436-4965-83dd-3d07ec99edde-kube-api-access-dndbx\") pod \"glance-db-create-6g4tw\" (UID: \"c984ce30-d436-4965-83dd-3d07ec99edde\") " pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.017981 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0d2ef6-0a27-40ca-89d8-6508620ef387-operator-scripts\") pod \"glance-747b-account-create-update-g5s5w\" (UID: \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\") " pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.018602 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c984ce30-d436-4965-83dd-3d07ec99edde-operator-scripts\") pod \"glance-db-create-6g4tw\" (UID: \"c984ce30-d436-4965-83dd-3d07ec99edde\") " pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.026591 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-747b-account-create-update-g5s5w"]
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.052087 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dndbx\" (UniqueName: \"kubernetes.io/projected/c984ce30-d436-4965-83dd-3d07ec99edde-kube-api-access-dndbx\") pod \"glance-db-create-6g4tw\" (UID: \"c984ce30-d436-4965-83dd-3d07ec99edde\") " pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.119947 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks69c\" (UniqueName: \"kubernetes.io/projected/1e0d2ef6-0a27-40ca-89d8-6508620ef387-kube-api-access-ks69c\") pod \"glance-747b-account-create-update-g5s5w\" (UID: \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\") " pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.120365 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0d2ef6-0a27-40ca-89d8-6508620ef387-operator-scripts\") pod \"glance-747b-account-create-update-g5s5w\" (UID: \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\") " pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.121084 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0d2ef6-0a27-40ca-89d8-6508620ef387-operator-scripts\") pod \"glance-747b-account-create-update-g5s5w\" (UID: \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\") " pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.144051 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks69c\" (UniqueName: \"kubernetes.io/projected/1e0d2ef6-0a27-40ca-89d8-6508620ef387-kube-api-access-ks69c\") pod \"glance-747b-account-create-update-g5s5w\" (UID: \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\") " pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.209142 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.307804 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.748878 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6g4tw"]
Jan 30 21:36:38 crc kubenswrapper[4834]: I0130 21:36:38.930816 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-747b-account-create-update-g5s5w"]
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.378588 4834 generic.go:334] "Generic (PLEG): container finished" podID="1e0d2ef6-0a27-40ca-89d8-6508620ef387" containerID="2ebefbca99ac1cd2493ef0753e3f52ad8e469a4ce01115609906e4ceec0d4f2e" exitCode=0
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.378696 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-747b-account-create-update-g5s5w" event={"ID":"1e0d2ef6-0a27-40ca-89d8-6508620ef387","Type":"ContainerDied","Data":"2ebefbca99ac1cd2493ef0753e3f52ad8e469a4ce01115609906e4ceec0d4f2e"}
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.378927 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-747b-account-create-update-g5s5w" event={"ID":"1e0d2ef6-0a27-40ca-89d8-6508620ef387","Type":"ContainerStarted","Data":"e97b5ef8cd3e574ca3138c4714fa96888523d8f672ba014324727507819c0f8b"}
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.380856 4834 generic.go:334] "Generic (PLEG): container finished" podID="c984ce30-d436-4965-83dd-3d07ec99edde" containerID="b980abe12aafe5d38e5f8768127d2ff0d6968da531333e1578c7114dfa76b10d" exitCode=0
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.380922 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6g4tw" event={"ID":"c984ce30-d436-4965-83dd-3d07ec99edde","Type":"ContainerDied","Data":"b980abe12aafe5d38e5f8768127d2ff0d6968da531333e1578c7114dfa76b10d"}
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.380955 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6g4tw" event={"ID":"c984ce30-d436-4965-83dd-3d07ec99edde","Type":"ContainerStarted","Data":"ea7324ffc7aca3f699844f13996414aee4941a9cc7f32c6fdab4edb151f13bab"}
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.607241 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qnlg7"]
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.619905 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qnlg7"]
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.692152 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pwvjc"]
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.693272 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pwvjc"
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.695293 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.743319 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pwvjc"]
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.759888 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/567cead5-a09c-4eee-b8ab-ebd2619b1565-operator-scripts\") pod \"root-account-create-update-pwvjc\" (UID: \"567cead5-a09c-4eee-b8ab-ebd2619b1565\") " pod="openstack/root-account-create-update-pwvjc"
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.760064 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4fjv\" (UniqueName: \"kubernetes.io/projected/567cead5-a09c-4eee-b8ab-ebd2619b1565-kube-api-access-z4fjv\") pod \"root-account-create-update-pwvjc\" (UID: \"567cead5-a09c-4eee-b8ab-ebd2619b1565\") " pod="openstack/root-account-create-update-pwvjc"
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.861996 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4fjv\" (UniqueName: \"kubernetes.io/projected/567cead5-a09c-4eee-b8ab-ebd2619b1565-kube-api-access-z4fjv\") pod \"root-account-create-update-pwvjc\" (UID: \"567cead5-a09c-4eee-b8ab-ebd2619b1565\") " pod="openstack/root-account-create-update-pwvjc"
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.862087 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/567cead5-a09c-4eee-b8ab-ebd2619b1565-operator-scripts\") pod \"root-account-create-update-pwvjc\" (UID: \"567cead5-a09c-4eee-b8ab-ebd2619b1565\") " pod="openstack/root-account-create-update-pwvjc"
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.863005 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/567cead5-a09c-4eee-b8ab-ebd2619b1565-operator-scripts\") pod \"root-account-create-update-pwvjc\" (UID: \"567cead5-a09c-4eee-b8ab-ebd2619b1565\") " pod="openstack/root-account-create-update-pwvjc"
Jan 30 21:36:39 crc kubenswrapper[4834]: I0130 21:36:39.881677 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4fjv\" (UniqueName: \"kubernetes.io/projected/567cead5-a09c-4eee-b8ab-ebd2619b1565-kube-api-access-z4fjv\") pod \"root-account-create-update-pwvjc\" (UID: \"567cead5-a09c-4eee-b8ab-ebd2619b1565\") " pod="openstack/root-account-create-update-pwvjc"
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.014651 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pwvjc"
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.395333 4834 generic.go:334] "Generic (PLEG): container finished" podID="9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" containerID="9a7bd76ab6a9c1be62662d3c6f473798ca2a952af365a74be3cd654948a354cf" exitCode=0
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.395447 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bv5d8" event={"ID":"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb","Type":"ContainerDied","Data":"9a7bd76ab6a9c1be62662d3c6f473798ca2a952af365a74be3cd654948a354cf"}
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.504874 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pwvjc"]
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.740729 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-747b-account-create-update-g5s5w"
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.787821 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0d2ef6-0a27-40ca-89d8-6508620ef387-operator-scripts\") pod \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\" (UID: \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\") "
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.788075 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks69c\" (UniqueName: \"kubernetes.io/projected/1e0d2ef6-0a27-40ca-89d8-6508620ef387-kube-api-access-ks69c\") pod \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\" (UID: \"1e0d2ef6-0a27-40ca-89d8-6508620ef387\") "
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.789532 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e0d2ef6-0a27-40ca-89d8-6508620ef387-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e0d2ef6-0a27-40ca-89d8-6508620ef387" (UID: "1e0d2ef6-0a27-40ca-89d8-6508620ef387"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.793636 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e0d2ef6-0a27-40ca-89d8-6508620ef387-kube-api-access-ks69c" (OuterVolumeSpecName: "kube-api-access-ks69c") pod "1e0d2ef6-0a27-40ca-89d8-6508620ef387" (UID: "1e0d2ef6-0a27-40ca-89d8-6508620ef387"). InnerVolumeSpecName "kube-api-access-ks69c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.809958 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6g4tw"
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.890317 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c984ce30-d436-4965-83dd-3d07ec99edde-operator-scripts\") pod \"c984ce30-d436-4965-83dd-3d07ec99edde\" (UID: \"c984ce30-d436-4965-83dd-3d07ec99edde\") "
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.890362 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dndbx\" (UniqueName: \"kubernetes.io/projected/c984ce30-d436-4965-83dd-3d07ec99edde-kube-api-access-dndbx\") pod \"c984ce30-d436-4965-83dd-3d07ec99edde\" (UID: \"c984ce30-d436-4965-83dd-3d07ec99edde\") "
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.890793 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks69c\" (UniqueName: \"kubernetes.io/projected/1e0d2ef6-0a27-40ca-89d8-6508620ef387-kube-api-access-ks69c\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.890810 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e0d2ef6-0a27-40ca-89d8-6508620ef387-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.890854 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c984ce30-d436-4965-83dd-3d07ec99edde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c984ce30-d436-4965-83dd-3d07ec99edde" (UID: "c984ce30-d436-4965-83dd-3d07ec99edde"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.894965 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c984ce30-d436-4965-83dd-3d07ec99edde-kube-api-access-dndbx" (OuterVolumeSpecName: "kube-api-access-dndbx") pod "c984ce30-d436-4965-83dd-3d07ec99edde" (UID: "c984ce30-d436-4965-83dd-3d07ec99edde"). InnerVolumeSpecName "kube-api-access-dndbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.992739 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c984ce30-d436-4965-83dd-3d07ec99edde-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:40 crc kubenswrapper[4834]: I0130 21:36:40.992970 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dndbx\" (UniqueName: \"kubernetes.io/projected/c984ce30-d436-4965-83dd-3d07ec99edde-kube-api-access-dndbx\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.013406 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.405429 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-747b-account-create-update-g5s5w" event={"ID":"1e0d2ef6-0a27-40ca-89d8-6508620ef387","Type":"ContainerDied","Data":"e97b5ef8cd3e574ca3138c4714fa96888523d8f672ba014324727507819c0f8b"} Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.405506 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e97b5ef8cd3e574ca3138c4714fa96888523d8f672ba014324727507819c0f8b" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.405482 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-747b-account-create-update-g5s5w" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.407204 4834 generic.go:334] "Generic (PLEG): container finished" podID="567cead5-a09c-4eee-b8ab-ebd2619b1565" containerID="30b429785ae07574307d2ee24ed2e4655ab0e90616b6dabdba9cd230fe9ee263" exitCode=0 Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.407292 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pwvjc" event={"ID":"567cead5-a09c-4eee-b8ab-ebd2619b1565","Type":"ContainerDied","Data":"30b429785ae07574307d2ee24ed2e4655ab0e90616b6dabdba9cd230fe9ee263"} Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.407340 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pwvjc" event={"ID":"567cead5-a09c-4eee-b8ab-ebd2619b1565","Type":"ContainerStarted","Data":"9fa0aa1e213583edb19037dd1b965d085c9952107b47b9cfb5b114fe784d86da"} Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.408977 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6g4tw" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.408989 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6g4tw" event={"ID":"c984ce30-d436-4965-83dd-3d07ec99edde","Type":"ContainerDied","Data":"ea7324ffc7aca3f699844f13996414aee4941a9cc7f32c6fdab4edb151f13bab"} Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.409028 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7324ffc7aca3f699844f13996414aee4941a9cc7f32c6fdab4edb151f13bab" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.553898 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651f733b-17b1-4837-93d5-f206e004062a" path="/var/lib/kubelet/pods/651f733b-17b1-4837-93d5-f206e004062a/volumes" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.707263 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.722849 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/070baa9f-0897-4fe2-bc14-68a831d81dce-etc-swift\") pod \"swift-storage-0\" (UID: \"070baa9f-0897-4fe2-bc14-68a831d81dce\") " pod="openstack/swift-storage-0" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.754798 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.829070 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.910191 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-dispersionconf\") pod \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.910535 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-scripts\") pod \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.910626 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-combined-ca-bundle\") pod \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.910759 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-etc-swift\") pod \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.911002 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-ring-data-devices\") pod \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.911105 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-swiftconf\") pod \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.911236 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz6vq\" (UniqueName: \"kubernetes.io/projected/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-kube-api-access-hz6vq\") pod \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\" (UID: \"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb\") " Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.911412 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" (UID: "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.911793 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" (UID: "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.912020 4834 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.912100 4834 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.922436 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-kube-api-access-hz6vq" (OuterVolumeSpecName: "kube-api-access-hz6vq") pod "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" (UID: "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb"). InnerVolumeSpecName "kube-api-access-hz6vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.932594 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" (UID: "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:41 crc kubenswrapper[4834]: I0130 21:36:41.987516 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" (UID: "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.001211 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" (UID: "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.019633 4834 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.019656 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz6vq\" (UniqueName: \"kubernetes.io/projected/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-kube-api-access-hz6vq\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.019666 4834 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.019674 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.092364 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-scripts" (OuterVolumeSpecName: "scripts") pod "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" (UID: "9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.121380 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.419673 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bv5d8" Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.419659 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bv5d8" event={"ID":"9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb","Type":"ContainerDied","Data":"85a43e85094511c47230af8608e5b710a281702f1dc46bbac46fa3a3f041f4c4"} Jan 30 21:36:42 crc kubenswrapper[4834]: I0130 21:36:42.420191 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85a43e85094511c47230af8608e5b710a281702f1dc46bbac46fa3a3f041f4c4" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.165141 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mdpvh"] Jan 30 21:36:43 crc kubenswrapper[4834]: E0130 21:36:43.165818 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c984ce30-d436-4965-83dd-3d07ec99edde" containerName="mariadb-database-create" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.165833 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c984ce30-d436-4965-83dd-3d07ec99edde" containerName="mariadb-database-create" Jan 30 21:36:43 crc kubenswrapper[4834]: E0130 21:36:43.165844 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e0d2ef6-0a27-40ca-89d8-6508620ef387" containerName="mariadb-account-create-update" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.165850 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e0d2ef6-0a27-40ca-89d8-6508620ef387" 
containerName="mariadb-account-create-update" Jan 30 21:36:43 crc kubenswrapper[4834]: E0130 21:36:43.165864 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" containerName="swift-ring-rebalance" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.165871 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" containerName="swift-ring-rebalance" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.166028 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb" containerName="swift-ring-rebalance" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.166048 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e0d2ef6-0a27-40ca-89d8-6508620ef387" containerName="mariadb-account-create-update" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.166058 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c984ce30-d436-4965-83dd-3d07ec99edde" containerName="mariadb-database-create" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.166750 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.169036 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nk74b" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.169141 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.176582 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mdpvh"] Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.243683 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-config-data\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.243743 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqwp\" (UniqueName: \"kubernetes.io/projected/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-kube-api-access-4lqwp\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.243952 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-db-sync-config-data\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.244070 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-combined-ca-bundle\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.322414 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pwvjc" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.340075 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.345836 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-db-sync-config-data\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.345888 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-combined-ca-bundle\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.345950 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-config-data\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.345991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqwp\" (UniqueName: \"kubernetes.io/projected/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-kube-api-access-4lqwp\") pod \"glance-db-sync-mdpvh\" (UID: 
\"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.352052 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-db-sync-config-data\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.353441 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-config-data\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.353972 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-combined-ca-bundle\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.372439 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-t2k4l" podUID="493ce910-9c99-49f5-85eb-3917715c87b6" containerName="ovn-controller" probeResult="failure" output=< Jan 30 21:36:43 crc kubenswrapper[4834]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 21:36:43 crc kubenswrapper[4834]: > Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.379906 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqwp\" (UniqueName: \"kubernetes.io/projected/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-kube-api-access-4lqwp\") pod \"glance-db-sync-mdpvh\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " pod="openstack/glance-db-sync-mdpvh" 
Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.413779 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.441857 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"468d02a3c7dc21efd469a3217431f4a4cb1aa69ab802c9cce42d00d49567fb2d"} Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.447624 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4fjv\" (UniqueName: \"kubernetes.io/projected/567cead5-a09c-4eee-b8ab-ebd2619b1565-kube-api-access-z4fjv\") pod \"567cead5-a09c-4eee-b8ab-ebd2619b1565\" (UID: \"567cead5-a09c-4eee-b8ab-ebd2619b1565\") " Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.447883 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pwvjc" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.448063 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pwvjc" event={"ID":"567cead5-a09c-4eee-b8ab-ebd2619b1565","Type":"ContainerDied","Data":"9fa0aa1e213583edb19037dd1b965d085c9952107b47b9cfb5b114fe784d86da"} Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.448129 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fa0aa1e213583edb19037dd1b965d085c9952107b47b9cfb5b114fe784d86da" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.447896 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/567cead5-a09c-4eee-b8ab-ebd2619b1565-operator-scripts\") pod \"567cead5-a09c-4eee-b8ab-ebd2619b1565\" (UID: \"567cead5-a09c-4eee-b8ab-ebd2619b1565\") " Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.448614 
4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567cead5-a09c-4eee-b8ab-ebd2619b1565-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "567cead5-a09c-4eee-b8ab-ebd2619b1565" (UID: "567cead5-a09c-4eee-b8ab-ebd2619b1565"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.451483 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/567cead5-a09c-4eee-b8ab-ebd2619b1565-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.454622 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567cead5-a09c-4eee-b8ab-ebd2619b1565-kube-api-access-z4fjv" (OuterVolumeSpecName: "kube-api-access-z4fjv") pod "567cead5-a09c-4eee-b8ab-ebd2619b1565" (UID: "567cead5-a09c-4eee-b8ab-ebd2619b1565"). InnerVolumeSpecName "kube-api-access-z4fjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.492609 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mdpvh" Jan 30 21:36:43 crc kubenswrapper[4834]: I0130 21:36:43.553308 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4fjv\" (UniqueName: \"kubernetes.io/projected/567cead5-a09c-4eee-b8ab-ebd2619b1565-kube-api-access-z4fjv\") on node \"crc\" DevicePath \"\"" Jan 30 21:36:44 crc kubenswrapper[4834]: W0130 21:36:44.079102 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb88d2fad_923e_4f77_b8f2_d8be1858e1ff.slice/crio-12943535aca210ae749f92e185f6ac931fc209698b2c829a6ea64e0e252ca899 WatchSource:0}: Error finding container 12943535aca210ae749f92e185f6ac931fc209698b2c829a6ea64e0e252ca899: Status 404 returned error can't find the container with id 12943535aca210ae749f92e185f6ac931fc209698b2c829a6ea64e0e252ca899 Jan 30 21:36:44 crc kubenswrapper[4834]: I0130 21:36:44.083809 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mdpvh"] Jan 30 21:36:44 crc kubenswrapper[4834]: I0130 21:36:44.456460 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mdpvh" event={"ID":"b88d2fad-923e-4f77-b8f2-d8be1858e1ff","Type":"ContainerStarted","Data":"12943535aca210ae749f92e185f6ac931fc209698b2c829a6ea64e0e252ca899"} Jan 30 21:36:45 crc kubenswrapper[4834]: I0130 21:36:45.464565 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"363d2e11a1e1e5bdfdd543e1e4ff160fe399a08764f7b707890aa6665f1a4c62"} Jan 30 21:36:46 crc kubenswrapper[4834]: I0130 21:36:46.165918 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pwvjc"] Jan 30 21:36:46 crc kubenswrapper[4834]: I0130 21:36:46.172957 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-pwvjc"] Jan 30 
21:36:46 crc kubenswrapper[4834]: I0130 21:36:46.477173 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"6329120cdf1f2dde17e765f1939781c5bc240473e02c8623a7eb2dd62dc6780a"} Jan 30 21:36:46 crc kubenswrapper[4834]: I0130 21:36:46.477280 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"0d8a17d66390c6653b58d19923aaa94034494e0e4ea17bea8b2f5b643ad8030e"} Jan 30 21:36:46 crc kubenswrapper[4834]: I0130 21:36:46.477294 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"b87539d811f1e641b1ebd87f463b487f8ce3531619808c7c34bc93fe28c5ceca"} Jan 30 21:36:47 crc kubenswrapper[4834]: I0130 21:36:47.547298 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567cead5-a09c-4eee-b8ab-ebd2619b1565" path="/var/lib/kubelet/pods/567cead5-a09c-4eee-b8ab-ebd2619b1565/volumes" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.363891 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-t2k4l" podUID="493ce910-9c99-49f5-85eb-3917715c87b6" containerName="ovn-controller" probeResult="failure" output=< Jan 30 21:36:48 crc kubenswrapper[4834]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 21:36:48 crc kubenswrapper[4834]: > Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.421248 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qxr8t" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.495712 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"49668a78b9ed9b21afcefb7c4ebaa4a59412d10c7711af976bb6dab8fc756656"} Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.495963 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"1eb05899695e7e4dbe1845dd5e6538b2b6b51e688127358a50d908793b134fb9"} Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.496060 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"37f9159e01f940431de972424bfd22f4cdc996e1f7cd3e533657bd3b81e45617"} Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.496138 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"5a692f081b7e6b57c3b6230c07f8f1b3bf9332a796d47e90cd77774dec342b59"} Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.625221 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-t2k4l-config-wxkd5"] Jan 30 21:36:48 crc kubenswrapper[4834]: E0130 21:36:48.625965 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567cead5-a09c-4eee-b8ab-ebd2619b1565" containerName="mariadb-account-create-update" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.625985 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="567cead5-a09c-4eee-b8ab-ebd2619b1565" containerName="mariadb-account-create-update" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.626195 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="567cead5-a09c-4eee-b8ab-ebd2619b1565" containerName="mariadb-account-create-update" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.627019 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.629281 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.640523 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t2k4l-config-wxkd5"] Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.759998 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn4h4\" (UniqueName: \"kubernetes.io/projected/a69828df-e6d0-4555-8c25-0e7669ad0269-kube-api-access-xn4h4\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.760145 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-log-ovn\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.760525 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-scripts\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.760642 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-additional-scripts\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: 
\"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.760819 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run-ovn\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.760862 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862414 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862479 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn4h4\" (UniqueName: \"kubernetes.io/projected/a69828df-e6d0-4555-8c25-0e7669ad0269-kube-api-access-xn4h4\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-log-ovn\") pod 
\"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862626 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-scripts\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862671 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-additional-scripts\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862738 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run-ovn\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862780 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862780 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-log-ovn\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: 
\"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.862861 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run-ovn\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.863770 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-additional-scripts\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.864878 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-scripts\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.887248 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn4h4\" (UniqueName: \"kubernetes.io/projected/a69828df-e6d0-4555-8c25-0e7669ad0269-kube-api-access-xn4h4\") pod \"ovn-controller-t2k4l-config-wxkd5\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:48 crc kubenswrapper[4834]: I0130 21:36:48.956607 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.460850 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.815639 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.829523 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6vxnz"] Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.830575 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6vxnz" Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.843854 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6vxnz"] Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.906425 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkr7\" (UniqueName: \"kubernetes.io/projected/63bd7fa6-422c-44c1-8d65-569d82a716f1-kube-api-access-9nkr7\") pod \"cinder-db-create-6vxnz\" (UID: \"63bd7fa6-422c-44c1-8d65-569d82a716f1\") " pod="openstack/cinder-db-create-6vxnz" Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.906734 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63bd7fa6-422c-44c1-8d65-569d82a716f1-operator-scripts\") pod \"cinder-db-create-6vxnz\" (UID: \"63bd7fa6-422c-44c1-8d65-569d82a716f1\") " pod="openstack/cinder-db-create-6vxnz" Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.942523 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rhwh6"] Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.943740 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rhwh6" Jan 30 21:36:49 crc kubenswrapper[4834]: I0130 21:36:49.973690 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rhwh6"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:49.999195 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5d0f-account-create-update-wjvkc"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.000146 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.004110 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.016712 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5d0f-account-create-update-wjvkc"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.020481 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkr7\" (UniqueName: \"kubernetes.io/projected/63bd7fa6-422c-44c1-8d65-569d82a716f1-kube-api-access-9nkr7\") pod \"cinder-db-create-6vxnz\" (UID: \"63bd7fa6-422c-44c1-8d65-569d82a716f1\") " pod="openstack/cinder-db-create-6vxnz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.020516 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63bd7fa6-422c-44c1-8d65-569d82a716f1-operator-scripts\") pod \"cinder-db-create-6vxnz\" (UID: \"63bd7fa6-422c-44c1-8d65-569d82a716f1\") " pod="openstack/cinder-db-create-6vxnz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.020577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b295ca4-d536-4367-92e2-746945500e9a-operator-scripts\") pod 
\"barbican-db-create-rhwh6\" (UID: \"8b295ca4-d536-4367-92e2-746945500e9a\") " pod="openstack/barbican-db-create-rhwh6" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.021432 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63bd7fa6-422c-44c1-8d65-569d82a716f1-operator-scripts\") pod \"cinder-db-create-6vxnz\" (UID: \"63bd7fa6-422c-44c1-8d65-569d82a716f1\") " pod="openstack/cinder-db-create-6vxnz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.023818 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxcnm\" (UniqueName: \"kubernetes.io/projected/8b295ca4-d536-4367-92e2-746945500e9a-kube-api-access-hxcnm\") pod \"barbican-db-create-rhwh6\" (UID: \"8b295ca4-d536-4367-92e2-746945500e9a\") " pod="openstack/barbican-db-create-rhwh6" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.059780 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nkr7\" (UniqueName: \"kubernetes.io/projected/63bd7fa6-422c-44c1-8d65-569d82a716f1-kube-api-access-9nkr7\") pod \"cinder-db-create-6vxnz\" (UID: \"63bd7fa6-422c-44c1-8d65-569d82a716f1\") " pod="openstack/cinder-db-create-6vxnz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.112016 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2700-account-create-update-kqx8q"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.120988 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.125010 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b295ca4-d536-4367-92e2-746945500e9a-operator-scripts\") pod \"barbican-db-create-rhwh6\" (UID: \"8b295ca4-d536-4367-92e2-746945500e9a\") " pod="openstack/barbican-db-create-rhwh6" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.125100 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxcnm\" (UniqueName: \"kubernetes.io/projected/8b295ca4-d536-4367-92e2-746945500e9a-kube-api-access-hxcnm\") pod \"barbican-db-create-rhwh6\" (UID: \"8b295ca4-d536-4367-92e2-746945500e9a\") " pod="openstack/barbican-db-create-rhwh6" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.125135 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-operator-scripts\") pod \"cinder-5d0f-account-create-update-wjvkc\" (UID: \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\") " pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.125175 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zpdv\" (UniqueName: \"kubernetes.io/projected/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-kube-api-access-4zpdv\") pod \"cinder-5d0f-account-create-update-wjvkc\" (UID: \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\") " pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.125322 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.126296 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b295ca4-d536-4367-92e2-746945500e9a-operator-scripts\") pod \"barbican-db-create-rhwh6\" (UID: \"8b295ca4-d536-4367-92e2-746945500e9a\") " pod="openstack/barbican-db-create-rhwh6" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.133483 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2700-account-create-update-kqx8q"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.165635 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxcnm\" (UniqueName: \"kubernetes.io/projected/8b295ca4-d536-4367-92e2-746945500e9a-kube-api-access-hxcnm\") pod \"barbican-db-create-rhwh6\" (UID: \"8b295ca4-d536-4367-92e2-746945500e9a\") " pod="openstack/barbican-db-create-rhwh6" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.226898 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-operator-scripts\") pod \"cinder-5d0f-account-create-update-wjvkc\" (UID: \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\") " pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.227185 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krkmp\" (UniqueName: \"kubernetes.io/projected/06bf42dd-19ce-4bef-af7c-4777a04fbdca-kube-api-access-krkmp\") pod \"barbican-2700-account-create-update-kqx8q\" (UID: \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\") " pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.227226 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bf42dd-19ce-4bef-af7c-4777a04fbdca-operator-scripts\") pod 
\"barbican-2700-account-create-update-kqx8q\" (UID: \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\") " pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.227245 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zpdv\" (UniqueName: \"kubernetes.io/projected/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-kube-api-access-4zpdv\") pod \"cinder-5d0f-account-create-update-wjvkc\" (UID: \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\") " pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.228214 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-operator-scripts\") pod \"cinder-5d0f-account-create-update-wjvkc\" (UID: \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\") " pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.255603 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6vxnz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.269500 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zpdv\" (UniqueName: \"kubernetes.io/projected/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-kube-api-access-4zpdv\") pod \"cinder-5d0f-account-create-update-wjvkc\" (UID: \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\") " pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.278983 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5xww2"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.280314 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5xww2" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.284939 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rhwh6" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.294814 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5xww2"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.320354 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dfde-account-create-update-m4mzz"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.321750 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.330081 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.330467 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.331585 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krkmp\" (UniqueName: \"kubernetes.io/projected/06bf42dd-19ce-4bef-af7c-4777a04fbdca-kube-api-access-krkmp\") pod \"barbican-2700-account-create-update-kqx8q\" (UID: \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\") " pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.331629 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bf42dd-19ce-4bef-af7c-4777a04fbdca-operator-scripts\") pod \"barbican-2700-account-create-update-kqx8q\" (UID: \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\") " pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:36:50 crc 
kubenswrapper[4834]: I0130 21:36:50.331664 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12040694-1a05-486f-a71d-eb6940d21985-operator-scripts\") pod \"neutron-db-create-5xww2\" (UID: \"12040694-1a05-486f-a71d-eb6940d21985\") " pod="openstack/neutron-db-create-5xww2" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.331716 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz5ck\" (UniqueName: \"kubernetes.io/projected/12040694-1a05-486f-a71d-eb6940d21985-kube-api-access-fz5ck\") pod \"neutron-db-create-5xww2\" (UID: \"12040694-1a05-486f-a71d-eb6940d21985\") " pod="openstack/neutron-db-create-5xww2" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.336164 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bf42dd-19ce-4bef-af7c-4777a04fbdca-operator-scripts\") pod \"barbican-2700-account-create-update-kqx8q\" (UID: \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\") " pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.373210 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dfde-account-create-update-m4mzz"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.374258 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krkmp\" (UniqueName: \"kubernetes.io/projected/06bf42dd-19ce-4bef-af7c-4777a04fbdca-kube-api-access-krkmp\") pod \"barbican-2700-account-create-update-kqx8q\" (UID: \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\") " pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.381882 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-zct6g"] Jan 30 21:36:50 crc kubenswrapper[4834]: 
I0130 21:36:50.383131 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.387415 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.387615 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.387631 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.387780 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xzg4z" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.394982 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zct6g"] Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.432951 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12040694-1a05-486f-a71d-eb6940d21985-operator-scripts\") pod \"neutron-db-create-5xww2\" (UID: \"12040694-1a05-486f-a71d-eb6940d21985\") " pod="openstack/neutron-db-create-5xww2" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.432999 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4dwk\" (UniqueName: \"kubernetes.io/projected/37293f9c-d144-4553-ab7c-04caf3e28d18-kube-api-access-n4dwk\") pod \"neutron-dfde-account-create-update-m4mzz\" (UID: \"37293f9c-d144-4553-ab7c-04caf3e28d18\") " pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.433063 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkbj\" (UniqueName: 
\"kubernetes.io/projected/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-kube-api-access-lqkbj\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.433103 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz5ck\" (UniqueName: \"kubernetes.io/projected/12040694-1a05-486f-a71d-eb6940d21985-kube-api-access-fz5ck\") pod \"neutron-db-create-5xww2\" (UID: \"12040694-1a05-486f-a71d-eb6940d21985\") " pod="openstack/neutron-db-create-5xww2" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.433136 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37293f9c-d144-4553-ab7c-04caf3e28d18-operator-scripts\") pod \"neutron-dfde-account-create-update-m4mzz\" (UID: \"37293f9c-d144-4553-ab7c-04caf3e28d18\") " pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.433202 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-config-data\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.433603 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-combined-ca-bundle\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.433931 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/12040694-1a05-486f-a71d-eb6940d21985-operator-scripts\") pod \"neutron-db-create-5xww2\" (UID: \"12040694-1a05-486f-a71d-eb6940d21985\") " pod="openstack/neutron-db-create-5xww2" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.440657 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.452641 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz5ck\" (UniqueName: \"kubernetes.io/projected/12040694-1a05-486f-a71d-eb6940d21985-kube-api-access-fz5ck\") pod \"neutron-db-create-5xww2\" (UID: \"12040694-1a05-486f-a71d-eb6940d21985\") " pod="openstack/neutron-db-create-5xww2" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.535653 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-combined-ca-bundle\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.535711 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4dwk\" (UniqueName: \"kubernetes.io/projected/37293f9c-d144-4553-ab7c-04caf3e28d18-kube-api-access-n4dwk\") pod \"neutron-dfde-account-create-update-m4mzz\" (UID: \"37293f9c-d144-4553-ab7c-04caf3e28d18\") " pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.535754 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkbj\" (UniqueName: \"kubernetes.io/projected/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-kube-api-access-lqkbj\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 
21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.535800 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37293f9c-d144-4553-ab7c-04caf3e28d18-operator-scripts\") pod \"neutron-dfde-account-create-update-m4mzz\" (UID: \"37293f9c-d144-4553-ab7c-04caf3e28d18\") " pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.535851 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-config-data\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.537182 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37293f9c-d144-4553-ab7c-04caf3e28d18-operator-scripts\") pod \"neutron-dfde-account-create-update-m4mzz\" (UID: \"37293f9c-d144-4553-ab7c-04caf3e28d18\") " pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.542183 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-combined-ca-bundle\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.545014 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-config-data\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.554631 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkbj\" (UniqueName: \"kubernetes.io/projected/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-kube-api-access-lqkbj\") pod \"keystone-db-sync-zct6g\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.554760 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4dwk\" (UniqueName: \"kubernetes.io/projected/37293f9c-d144-4553-ab7c-04caf3e28d18-kube-api-access-n4dwk\") pod \"neutron-dfde-account-create-update-m4mzz\" (UID: \"37293f9c-d144-4553-ab7c-04caf3e28d18\") " pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.611504 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5xww2" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.645901 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:36:50 crc kubenswrapper[4834]: I0130 21:36:50.702502 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zct6g" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.450804 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rrsl9"] Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.452167 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rrsl9" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.454576 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.479009 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rrsl9"] Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.552148 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-operator-scripts\") pod \"root-account-create-update-rrsl9\" (UID: \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\") " pod="openstack/root-account-create-update-rrsl9" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.552211 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllzp\" (UniqueName: \"kubernetes.io/projected/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-kube-api-access-zllzp\") pod \"root-account-create-update-rrsl9\" (UID: \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\") " pod="openstack/root-account-create-update-rrsl9" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.653841 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-operator-scripts\") pod \"root-account-create-update-rrsl9\" (UID: \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\") " pod="openstack/root-account-create-update-rrsl9" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.654140 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllzp\" (UniqueName: \"kubernetes.io/projected/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-kube-api-access-zllzp\") pod \"root-account-create-update-rrsl9\" (UID: 
\"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\") " pod="openstack/root-account-create-update-rrsl9" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.655772 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-operator-scripts\") pod \"root-account-create-update-rrsl9\" (UID: \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\") " pod="openstack/root-account-create-update-rrsl9" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.685240 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllzp\" (UniqueName: \"kubernetes.io/projected/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-kube-api-access-zllzp\") pod \"root-account-create-update-rrsl9\" (UID: \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\") " pod="openstack/root-account-create-update-rrsl9" Jan 30 21:36:51 crc kubenswrapper[4834]: I0130 21:36:51.793149 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rrsl9" Jan 30 21:36:53 crc kubenswrapper[4834]: I0130 21:36:53.355196 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-t2k4l" podUID="493ce910-9c99-49f5-85eb-3917715c87b6" containerName="ovn-controller" probeResult="failure" output=< Jan 30 21:36:53 crc kubenswrapper[4834]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 30 21:36:53 crc kubenswrapper[4834]: > Jan 30 21:36:56 crc kubenswrapper[4834]: I0130 21:36:56.708078 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dfde-account-create-update-m4mzz"] Jan 30 21:36:56 crc kubenswrapper[4834]: I0130 21:36:56.919359 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6vxnz"] Jan 30 21:36:56 crc kubenswrapper[4834]: I0130 21:36:56.926724 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rhwh6"] Jan 30 21:36:56 crc kubenswrapper[4834]: I0130 21:36:56.934040 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zct6g"] Jan 30 21:36:57 crc kubenswrapper[4834]: W0130 21:36:57.006267 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95a073f1_b8e0_4365_ae24_c4e2351a7f0e.slice/crio-4e13efd2f75fab401466db43a4a39f4e6dcb89c4e6101435f04e350251a8025d WatchSource:0}: Error finding container 4e13efd2f75fab401466db43a4a39f4e6dcb89c4e6101435f04e350251a8025d: Status 404 returned error can't find the container with id 4e13efd2f75fab401466db43a4a39f4e6dcb89c4e6101435f04e350251a8025d Jan 30 21:36:57 crc kubenswrapper[4834]: W0130 21:36:57.010754 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b295ca4_d536_4367_92e2_746945500e9a.slice/crio-899cd0d4a5fe2bc7db2b77d105ae7fc7ad2946d1f0c4e8df8aca94e161909118 WatchSource:0}: Error finding container 899cd0d4a5fe2bc7db2b77d105ae7fc7ad2946d1f0c4e8df8aca94e161909118: Status 404 returned error can't find the container with id 899cd0d4a5fe2bc7db2b77d105ae7fc7ad2946d1f0c4e8df8aca94e161909118 Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.106344 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5xww2"] Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.123590 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5d0f-account-create-update-wjvkc"] Jan 30 21:36:57 crc kubenswrapper[4834]: W0130 21:36:57.128064 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda69828df_e6d0_4555_8c25_0e7669ad0269.slice/crio-cc44f5a1277deb61857e63216432ee9e87c475cc2162b9ecc803ca258e954e9c WatchSource:0}: Error finding container cc44f5a1277deb61857e63216432ee9e87c475cc2162b9ecc803ca258e954e9c: Status 404 returned error can't find the container with id cc44f5a1277deb61857e63216432ee9e87c475cc2162b9ecc803ca258e954e9c Jan 30 21:36:57 crc kubenswrapper[4834]: W0130 21:36:57.130539 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12040694_1a05_486f_a71d_eb6940d21985.slice/crio-89d4844a3b6e18dd1fd2f138d675daa4e0afd09c87272f2bd582f4ad6d7ac91a WatchSource:0}: Error finding container 89d4844a3b6e18dd1fd2f138d675daa4e0afd09c87272f2bd582f4ad6d7ac91a: Status 404 returned error can't find the container with id 89d4844a3b6e18dd1fd2f138d675daa4e0afd09c87272f2bd582f4ad6d7ac91a Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.130659 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t2k4l-config-wxkd5"] 
Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.137282 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rrsl9"] Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.144259 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2700-account-create-update-kqx8q"] Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.624338 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t2k4l-config-wxkd5" event={"ID":"a69828df-e6d0-4555-8c25-0e7669ad0269","Type":"ContainerStarted","Data":"ff8dc2bf48bde71be8dd7f9e99850cad657cbb62d67123905a28d5641c81cae9"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.624722 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t2k4l-config-wxkd5" event={"ID":"a69828df-e6d0-4555-8c25-0e7669ad0269","Type":"ContainerStarted","Data":"cc44f5a1277deb61857e63216432ee9e87c475cc2162b9ecc803ca258e954e9c"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.638765 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrsl9" event={"ID":"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7","Type":"ContainerStarted","Data":"1ec86ae8e8fc32f2fcb9d9b7ed35ee2b2dce03287356b82f42e482fec1b3914c"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.638801 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrsl9" event={"ID":"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7","Type":"ContainerStarted","Data":"db9e8b75b12909b2465fd79aa0f212f5c2c54caedf765ace8a6388c8134eca20"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.642279 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-t2k4l-config-wxkd5" podStartSLOduration=9.642263011 podStartE2EDuration="9.642263011s" podCreationTimestamp="2026-01-30 21:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:57.639311708 +0000 UTC m=+1268.792457846" watchObservedRunningTime="2026-01-30 21:36:57.642263011 +0000 UTC m=+1268.795409149" Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.646091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfde-account-create-update-m4mzz" event={"ID":"37293f9c-d144-4553-ab7c-04caf3e28d18","Type":"ContainerStarted","Data":"7a253c696c1b6a1eab036d4bb5186d4d347b74150d2d271133bcc8cd9ad5c3fd"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.646135 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfde-account-create-update-m4mzz" event={"ID":"37293f9c-d144-4553-ab7c-04caf3e28d18","Type":"ContainerStarted","Data":"40b77c1eee2cd175511a676a85b60182e02f867e6e9c220080ef77daffc323b2"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.659365 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-rrsl9" podStartSLOduration=6.659347271 podStartE2EDuration="6.659347271s" podCreationTimestamp="2026-01-30 21:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:57.65574193 +0000 UTC m=+1268.808888068" watchObservedRunningTime="2026-01-30 21:36:57.659347271 +0000 UTC m=+1268.812493409" Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.663419 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"1506dc4d6bf1a64f57c050c988ea0fb5d481c40bc12b07db9ea72d5fb5cbcd92"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.685605 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dfde-account-create-update-m4mzz" podStartSLOduration=7.685584068 
podStartE2EDuration="7.685584068s" podCreationTimestamp="2026-01-30 21:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:57.681101953 +0000 UTC m=+1268.834248091" watchObservedRunningTime="2026-01-30 21:36:57.685584068 +0000 UTC m=+1268.838730206" Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.686799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rhwh6" event={"ID":"8b295ca4-d536-4367-92e2-746945500e9a","Type":"ContainerStarted","Data":"7aaff3e86e401febc7b069870846ac8dfd8d8de37d3d393ac04f337241eb64fc"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.686856 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rhwh6" event={"ID":"8b295ca4-d536-4367-92e2-746945500e9a","Type":"ContainerStarted","Data":"899cd0d4a5fe2bc7db2b77d105ae7fc7ad2946d1f0c4e8df8aca94e161909118"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.696162 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6vxnz" event={"ID":"63bd7fa6-422c-44c1-8d65-569d82a716f1","Type":"ContainerStarted","Data":"bdfeb13905025438bfd0e6ac00ee5aadc6e9d440ae4f7227c957f5c0741fcb56"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.696523 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6vxnz" event={"ID":"63bd7fa6-422c-44c1-8d65-569d82a716f1","Type":"ContainerStarted","Data":"032da939f22daf31d1e4bbb95c0577ee5c93ff4a0e83ea6a7140bc4d1506ce04"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.703415 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mdpvh" event={"ID":"b88d2fad-923e-4f77-b8f2-d8be1858e1ff","Type":"ContainerStarted","Data":"b296177a63a222d735441c2738ca905fc4aa4146d067cda310e73d329e189780"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.709364 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-rhwh6" podStartSLOduration=8.709344736 podStartE2EDuration="8.709344736s" podCreationTimestamp="2026-01-30 21:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:57.705712524 +0000 UTC m=+1268.858858662" watchObservedRunningTime="2026-01-30 21:36:57.709344736 +0000 UTC m=+1268.862490874" Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.723010 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d0f-account-create-update-wjvkc" event={"ID":"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4","Type":"ContainerStarted","Data":"b5e18427c6c4628454ea795bb4f87a226674af23856a6b2f67e8979bafdc21d9"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.723056 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d0f-account-create-update-wjvkc" event={"ID":"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4","Type":"ContainerStarted","Data":"60b5e16e07fd44f14a88f71bab5914d174456be8c38c20f2abc99240c2b2e8ef"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.723746 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-6vxnz" podStartSLOduration=8.72371335 podStartE2EDuration="8.72371335s" podCreationTimestamp="2026-01-30 21:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:57.720816389 +0000 UTC m=+1268.873962527" watchObservedRunningTime="2026-01-30 21:36:57.72371335 +0000 UTC m=+1268.876859488" Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.725192 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5xww2" 
event={"ID":"12040694-1a05-486f-a71d-eb6940d21985","Type":"ContainerStarted","Data":"751bcc3a49f6dccdd019572068545ab1cf7d6e190f1668a8133345d583109042"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.725226 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5xww2" event={"ID":"12040694-1a05-486f-a71d-eb6940d21985","Type":"ContainerStarted","Data":"89d4844a3b6e18dd1fd2f138d675daa4e0afd09c87272f2bd582f4ad6d7ac91a"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.743550 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2700-account-create-update-kqx8q" event={"ID":"06bf42dd-19ce-4bef-af7c-4777a04fbdca","Type":"ContainerStarted","Data":"1fe311a630beda5bd922a6a19cb7669d9ce74d490be2e624cc44505c6a89359f"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.743621 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2700-account-create-update-kqx8q" event={"ID":"06bf42dd-19ce-4bef-af7c-4777a04fbdca","Type":"ContainerStarted","Data":"b7370f47e823951cdd8ae76b8c933bcd300dd30c94c0ca8846ca305b120e67d3"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.744978 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mdpvh" podStartSLOduration=2.449975671 podStartE2EDuration="14.744961757s" podCreationTimestamp="2026-01-30 21:36:43 +0000 UTC" firstStartedPulling="2026-01-30 21:36:44.081362576 +0000 UTC m=+1255.234508714" lastFinishedPulling="2026-01-30 21:36:56.376348662 +0000 UTC m=+1267.529494800" observedRunningTime="2026-01-30 21:36:57.738590648 +0000 UTC m=+1268.891736786" watchObservedRunningTime="2026-01-30 21:36:57.744961757 +0000 UTC m=+1268.898107895" Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.745856 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zct6g" 
event={"ID":"95a073f1-b8e0-4365-ae24-c4e2351a7f0e","Type":"ContainerStarted","Data":"4e13efd2f75fab401466db43a4a39f4e6dcb89c4e6101435f04e350251a8025d"} Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.769078 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-5xww2" podStartSLOduration=7.769061405 podStartE2EDuration="7.769061405s" podCreationTimestamp="2026-01-30 21:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:57.752964632 +0000 UTC m=+1268.906110770" watchObservedRunningTime="2026-01-30 21:36:57.769061405 +0000 UTC m=+1268.922207543" Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.782652 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-5d0f-account-create-update-wjvkc" podStartSLOduration=8.782635356 podStartE2EDuration="8.782635356s" podCreationTimestamp="2026-01-30 21:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:57.766848172 +0000 UTC m=+1268.919994310" watchObservedRunningTime="2026-01-30 21:36:57.782635356 +0000 UTC m=+1268.935781494" Jan 30 21:36:57 crc kubenswrapper[4834]: I0130 21:36:57.786752 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-2700-account-create-update-kqx8q" podStartSLOduration=7.786742942 podStartE2EDuration="7.786742942s" podCreationTimestamp="2026-01-30 21:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:36:57.779478767 +0000 UTC m=+1268.932624905" watchObservedRunningTime="2026-01-30 21:36:57.786742942 +0000 UTC m=+1268.939889080" Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.429877 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-t2k4l" Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.754677 4834 generic.go:334] "Generic (PLEG): container finished" podID="37293f9c-d144-4553-ab7c-04caf3e28d18" containerID="7a253c696c1b6a1eab036d4bb5186d4d347b74150d2d271133bcc8cd9ad5c3fd" exitCode=0 Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.754753 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfde-account-create-update-m4mzz" event={"ID":"37293f9c-d144-4553-ab7c-04caf3e28d18","Type":"ContainerDied","Data":"7a253c696c1b6a1eab036d4bb5186d4d347b74150d2d271133bcc8cd9ad5c3fd"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.756404 4834 generic.go:334] "Generic (PLEG): container finished" podID="8b295ca4-d536-4367-92e2-746945500e9a" containerID="7aaff3e86e401febc7b069870846ac8dfd8d8de37d3d393ac04f337241eb64fc" exitCode=0 Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.756466 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rhwh6" event={"ID":"8b295ca4-d536-4367-92e2-746945500e9a","Type":"ContainerDied","Data":"7aaff3e86e401febc7b069870846ac8dfd8d8de37d3d393ac04f337241eb64fc"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.758942 4834 generic.go:334] "Generic (PLEG): container finished" podID="83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7" containerID="1ec86ae8e8fc32f2fcb9d9b7ed35ee2b2dce03287356b82f42e482fec1b3914c" exitCode=0 Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.759032 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrsl9" event={"ID":"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7","Type":"ContainerDied","Data":"1ec86ae8e8fc32f2fcb9d9b7ed35ee2b2dce03287356b82f42e482fec1b3914c"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.761016 4834 generic.go:334] "Generic (PLEG): container finished" podID="12040694-1a05-486f-a71d-eb6940d21985" 
containerID="751bcc3a49f6dccdd019572068545ab1cf7d6e190f1668a8133345d583109042" exitCode=0 Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.761076 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5xww2" event={"ID":"12040694-1a05-486f-a71d-eb6940d21985","Type":"ContainerDied","Data":"751bcc3a49f6dccdd019572068545ab1cf7d6e190f1668a8133345d583109042"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.770708 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"e25a780dff8899572adfd9e0f16ccc9c5b3ecaa32c6e5f5ff5121bdf022cfb6a"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.770739 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"e72fa36c49c68a1f156792f84f935157ebf6639d948ecbf50872f924b097ba20"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.770748 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"e17c812535ee6addf61d89dbc88475e8bf191676391d282d8180ba3e98f80968"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.772212 4834 generic.go:334] "Generic (PLEG): container finished" podID="06bf42dd-19ce-4bef-af7c-4777a04fbdca" containerID="1fe311a630beda5bd922a6a19cb7669d9ce74d490be2e624cc44505c6a89359f" exitCode=0 Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.772258 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2700-account-create-update-kqx8q" event={"ID":"06bf42dd-19ce-4bef-af7c-4777a04fbdca","Type":"ContainerDied","Data":"1fe311a630beda5bd922a6a19cb7669d9ce74d490be2e624cc44505c6a89359f"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.777037 4834 generic.go:334] "Generic (PLEG): container 
finished" podID="63bd7fa6-422c-44c1-8d65-569d82a716f1" containerID="bdfeb13905025438bfd0e6ac00ee5aadc6e9d440ae4f7227c957f5c0741fcb56" exitCode=0 Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.777112 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6vxnz" event={"ID":"63bd7fa6-422c-44c1-8d65-569d82a716f1","Type":"ContainerDied","Data":"bdfeb13905025438bfd0e6ac00ee5aadc6e9d440ae4f7227c957f5c0741fcb56"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.779968 4834 generic.go:334] "Generic (PLEG): container finished" podID="a69828df-e6d0-4555-8c25-0e7669ad0269" containerID="ff8dc2bf48bde71be8dd7f9e99850cad657cbb62d67123905a28d5641c81cae9" exitCode=0 Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.780021 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t2k4l-config-wxkd5" event={"ID":"a69828df-e6d0-4555-8c25-0e7669ad0269","Type":"ContainerDied","Data":"ff8dc2bf48bde71be8dd7f9e99850cad657cbb62d67123905a28d5641c81cae9"} Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.792660 4834 generic.go:334] "Generic (PLEG): container finished" podID="4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4" containerID="b5e18427c6c4628454ea795bb4f87a226674af23856a6b2f67e8979bafdc21d9" exitCode=0 Jan 30 21:36:58 crc kubenswrapper[4834]: I0130 21:36:58.793038 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d0f-account-create-update-wjvkc" event={"ID":"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4","Type":"ContainerDied","Data":"b5e18427c6c4628454ea795bb4f87a226674af23856a6b2f67e8979bafdc21d9"} Jan 30 21:36:59 crc kubenswrapper[4834]: I0130 21:36:59.806447 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"64c0f50b0cff810183e6b8e459db3fa48d0004195bfc29ccbbcf15a51387b513"} Jan 30 21:36:59 crc kubenswrapper[4834]: I0130 21:36:59.806756 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"c93318dca8974bc51b99ec2b43e7034b55a6785408cf12b49436443714fb8984"} Jan 30 21:36:59 crc kubenswrapper[4834]: I0130 21:36:59.806768 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"070baa9f-0897-4fe2-bc14-68a831d81dce","Type":"ContainerStarted","Data":"dc240140985a49c9dd13166f46f787574c313d76ad5f590f6043023e1a38e40b"} Jan 30 21:36:59 crc kubenswrapper[4834]: I0130 21:36:59.869573 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=22.10146169 podStartE2EDuration="35.869556328s" podCreationTimestamp="2026-01-30 21:36:24 +0000 UTC" firstStartedPulling="2026-01-30 21:36:43.370774155 +0000 UTC m=+1254.523920293" lastFinishedPulling="2026-01-30 21:36:57.138868783 +0000 UTC m=+1268.292014931" observedRunningTime="2026-01-30 21:36:59.861817651 +0000 UTC m=+1271.014963789" watchObservedRunningTime="2026-01-30 21:36:59.869556328 +0000 UTC m=+1271.022702466" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.198024 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-4d6vx"] Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.201811 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.205790 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.217702 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-4d6vx"] Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.261999 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gpkm\" (UniqueName: \"kubernetes.io/projected/0524d426-e238-417b-a49d-1b2ea6e37e01-kube-api-access-2gpkm\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.262051 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.262094 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.262131 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " 
pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.262165 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.262213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-config\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.363300 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-config\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.363444 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gpkm\" (UniqueName: \"kubernetes.io/projected/0524d426-e238-417b-a49d-1b2ea6e37e01-kube-api-access-2gpkm\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.363465 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 
30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.363492 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.363520 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.363546 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.365891 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-config\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.366496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-svc\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.366713 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-sb\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.367488 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-swift-storage-0\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.367622 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-nb\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.433025 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gpkm\" (UniqueName: \"kubernetes.io/projected/0524d426-e238-417b-a49d-1b2ea6e37e01-kube-api-access-2gpkm\") pod \"dnsmasq-dns-6d5b6d6b67-4d6vx\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:00 crc kubenswrapper[4834]: I0130 21:37:00.533972 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.245586 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.250528 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rhwh6" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.285384 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.288056 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6vxnz" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.295163 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5xww2" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.301429 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rrsl9" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302450 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run\") pod \"a69828df-e6d0-4555-8c25-0e7669ad0269\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302516 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-scripts\") pod \"a69828df-e6d0-4555-8c25-0e7669ad0269\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302554 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b295ca4-d536-4367-92e2-746945500e9a-operator-scripts\") pod \"8b295ca4-d536-4367-92e2-746945500e9a\" (UID: \"8b295ca4-d536-4367-92e2-746945500e9a\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302610 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run" (OuterVolumeSpecName: "var-run") pod "a69828df-e6d0-4555-8c25-0e7669ad0269" (UID: "a69828df-e6d0-4555-8c25-0e7669ad0269"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302650 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxcnm\" (UniqueName: \"kubernetes.io/projected/8b295ca4-d536-4367-92e2-746945500e9a-kube-api-access-hxcnm\") pod \"8b295ca4-d536-4367-92e2-746945500e9a\" (UID: \"8b295ca4-d536-4367-92e2-746945500e9a\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302722 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-additional-scripts\") pod \"a69828df-e6d0-4555-8c25-0e7669ad0269\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302838 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn4h4\" (UniqueName: \"kubernetes.io/projected/a69828df-e6d0-4555-8c25-0e7669ad0269-kube-api-access-xn4h4\") pod \"a69828df-e6d0-4555-8c25-0e7669ad0269\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302893 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-log-ovn\") pod \"a69828df-e6d0-4555-8c25-0e7669ad0269\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.302917 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run-ovn\") pod 
\"a69828df-e6d0-4555-8c25-0e7669ad0269\" (UID: \"a69828df-e6d0-4555-8c25-0e7669ad0269\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.303431 4834 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.303484 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a69828df-e6d0-4555-8c25-0e7669ad0269" (UID: "a69828df-e6d0-4555-8c25-0e7669ad0269"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.304385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b295ca4-d536-4367-92e2-746945500e9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b295ca4-d536-4367-92e2-746945500e9a" (UID: "8b295ca4-d536-4367-92e2-746945500e9a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.304489 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-scripts" (OuterVolumeSpecName: "scripts") pod "a69828df-e6d0-4555-8c25-0e7669ad0269" (UID: "a69828df-e6d0-4555-8c25-0e7669ad0269"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.306376 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a69828df-e6d0-4555-8c25-0e7669ad0269" (UID: "a69828df-e6d0-4555-8c25-0e7669ad0269"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.306450 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a69828df-e6d0-4555-8c25-0e7669ad0269" (UID: "a69828df-e6d0-4555-8c25-0e7669ad0269"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.311808 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b295ca4-d536-4367-92e2-746945500e9a-kube-api-access-hxcnm" (OuterVolumeSpecName: "kube-api-access-hxcnm") pod "8b295ca4-d536-4367-92e2-746945500e9a" (UID: "8b295ca4-d536-4367-92e2-746945500e9a"). InnerVolumeSpecName "kube-api-access-hxcnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.312022 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69828df-e6d0-4555-8c25-0e7669ad0269-kube-api-access-xn4h4" (OuterVolumeSpecName: "kube-api-access-xn4h4") pod "a69828df-e6d0-4555-8c25-0e7669ad0269" (UID: "a69828df-e6d0-4555-8c25-0e7669ad0269"). InnerVolumeSpecName "kube-api-access-xn4h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.313219 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.378741 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.404824 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-operator-scripts\") pod \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\" (UID: \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.404877 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nkr7\" (UniqueName: \"kubernetes.io/projected/63bd7fa6-422c-44c1-8d65-569d82a716f1-kube-api-access-9nkr7\") pod \"63bd7fa6-422c-44c1-8d65-569d82a716f1\" (UID: \"63bd7fa6-422c-44c1-8d65-569d82a716f1\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.404947 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zpdv\" (UniqueName: \"kubernetes.io/projected/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-kube-api-access-4zpdv\") pod \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\" (UID: \"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405007 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krkmp\" (UniqueName: \"kubernetes.io/projected/06bf42dd-19ce-4bef-af7c-4777a04fbdca-kube-api-access-krkmp\") pod \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\" (UID: \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405099 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/63bd7fa6-422c-44c1-8d65-569d82a716f1-operator-scripts\") pod \"63bd7fa6-422c-44c1-8d65-569d82a716f1\" (UID: \"63bd7fa6-422c-44c1-8d65-569d82a716f1\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405164 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12040694-1a05-486f-a71d-eb6940d21985-operator-scripts\") pod \"12040694-1a05-486f-a71d-eb6940d21985\" (UID: \"12040694-1a05-486f-a71d-eb6940d21985\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405216 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zllzp\" (UniqueName: \"kubernetes.io/projected/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-kube-api-access-zllzp\") pod \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\" (UID: \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405276 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bf42dd-19ce-4bef-af7c-4777a04fbdca-operator-scripts\") pod \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\" (UID: \"06bf42dd-19ce-4bef-af7c-4777a04fbdca\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405313 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-operator-scripts\") pod \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\" (UID: \"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405349 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz5ck\" (UniqueName: \"kubernetes.io/projected/12040694-1a05-486f-a71d-eb6940d21985-kube-api-access-fz5ck\") pod \"12040694-1a05-486f-a71d-eb6940d21985\" (UID: \"12040694-1a05-486f-a71d-eb6940d21985\") " Jan 30 
21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405821 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405840 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b295ca4-d536-4367-92e2-746945500e9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405854 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxcnm\" (UniqueName: \"kubernetes.io/projected/8b295ca4-d536-4367-92e2-746945500e9a-kube-api-access-hxcnm\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405866 4834 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a69828df-e6d0-4555-8c25-0e7669ad0269-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405876 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn4h4\" (UniqueName: \"kubernetes.io/projected/a69828df-e6d0-4555-8c25-0e7669ad0269-kube-api-access-xn4h4\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405887 4834 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.405897 4834 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a69828df-e6d0-4555-8c25-0e7669ad0269-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.407036 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06bf42dd-19ce-4bef-af7c-4777a04fbdca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06bf42dd-19ce-4bef-af7c-4777a04fbdca" (UID: "06bf42dd-19ce-4bef-af7c-4777a04fbdca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.407200 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12040694-1a05-486f-a71d-eb6940d21985-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12040694-1a05-486f-a71d-eb6940d21985" (UID: "12040694-1a05-486f-a71d-eb6940d21985"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.407214 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7" (UID: "83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.407663 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4" (UID: "4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.407860 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bd7fa6-422c-44c1-8d65-569d82a716f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63bd7fa6-422c-44c1-8d65-569d82a716f1" (UID: "63bd7fa6-422c-44c1-8d65-569d82a716f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.409483 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12040694-1a05-486f-a71d-eb6940d21985-kube-api-access-fz5ck" (OuterVolumeSpecName: "kube-api-access-fz5ck") pod "12040694-1a05-486f-a71d-eb6940d21985" (UID: "12040694-1a05-486f-a71d-eb6940d21985"). InnerVolumeSpecName "kube-api-access-fz5ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.411118 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06bf42dd-19ce-4bef-af7c-4777a04fbdca-kube-api-access-krkmp" (OuterVolumeSpecName: "kube-api-access-krkmp") pod "06bf42dd-19ce-4bef-af7c-4777a04fbdca" (UID: "06bf42dd-19ce-4bef-af7c-4777a04fbdca"). InnerVolumeSpecName "kube-api-access-krkmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.411661 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-kube-api-access-zllzp" (OuterVolumeSpecName: "kube-api-access-zllzp") pod "83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7" (UID: "83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7"). InnerVolumeSpecName "kube-api-access-zllzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.412804 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bd7fa6-422c-44c1-8d65-569d82a716f1-kube-api-access-9nkr7" (OuterVolumeSpecName: "kube-api-access-9nkr7") pod "63bd7fa6-422c-44c1-8d65-569d82a716f1" (UID: "63bd7fa6-422c-44c1-8d65-569d82a716f1"). InnerVolumeSpecName "kube-api-access-9nkr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.413309 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-kube-api-access-4zpdv" (OuterVolumeSpecName: "kube-api-access-4zpdv") pod "4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4" (UID: "4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4"). InnerVolumeSpecName "kube-api-access-4zpdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.492040 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-4d6vx"] Jan 30 21:37:02 crc kubenswrapper[4834]: W0130 21:37:02.498709 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0524d426_e238_417b_a49d_1b2ea6e37e01.slice/crio-d11304e955545aff78493472b8be99fa3fe9e92ef7b1005b8c30913a22706474 WatchSource:0}: Error finding container d11304e955545aff78493472b8be99fa3fe9e92ef7b1005b8c30913a22706474: Status 404 returned error can't find the container with id d11304e955545aff78493472b8be99fa3fe9e92ef7b1005b8c30913a22706474 Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.507084 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37293f9c-d144-4553-ab7c-04caf3e28d18-operator-scripts\") pod \"37293f9c-d144-4553-ab7c-04caf3e28d18\" 
(UID: \"37293f9c-d144-4553-ab7c-04caf3e28d18\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.507128 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4dwk\" (UniqueName: \"kubernetes.io/projected/37293f9c-d144-4553-ab7c-04caf3e28d18-kube-api-access-n4dwk\") pod \"37293f9c-d144-4553-ab7c-04caf3e28d18\" (UID: \"37293f9c-d144-4553-ab7c-04caf3e28d18\") " Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.507499 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37293f9c-d144-4553-ab7c-04caf3e28d18-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37293f9c-d144-4553-ab7c-04caf3e28d18" (UID: "37293f9c-d144-4553-ab7c-04caf3e28d18"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508543 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508568 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz5ck\" (UniqueName: \"kubernetes.io/projected/12040694-1a05-486f-a71d-eb6940d21985-kube-api-access-fz5ck\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508581 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508590 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nkr7\" (UniqueName: \"kubernetes.io/projected/63bd7fa6-422c-44c1-8d65-569d82a716f1-kube-api-access-9nkr7\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc 
kubenswrapper[4834]: I0130 21:37:02.508599 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zpdv\" (UniqueName: \"kubernetes.io/projected/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4-kube-api-access-4zpdv\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508610 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krkmp\" (UniqueName: \"kubernetes.io/projected/06bf42dd-19ce-4bef-af7c-4777a04fbdca-kube-api-access-krkmp\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508640 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63bd7fa6-422c-44c1-8d65-569d82a716f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508648 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37293f9c-d144-4553-ab7c-04caf3e28d18-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508673 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12040694-1a05-486f-a71d-eb6940d21985-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508683 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zllzp\" (UniqueName: \"kubernetes.io/projected/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7-kube-api-access-zllzp\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.508691 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06bf42dd-19ce-4bef-af7c-4777a04fbdca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.511437 4834 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37293f9c-d144-4553-ab7c-04caf3e28d18-kube-api-access-n4dwk" (OuterVolumeSpecName: "kube-api-access-n4dwk") pod "37293f9c-d144-4553-ab7c-04caf3e28d18" (UID: "37293f9c-d144-4553-ab7c-04caf3e28d18"). InnerVolumeSpecName "kube-api-access-n4dwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.610925 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4dwk\" (UniqueName: \"kubernetes.io/projected/37293f9c-d144-4553-ab7c-04caf3e28d18-kube-api-access-n4dwk\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.832219 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rhwh6" event={"ID":"8b295ca4-d536-4367-92e2-746945500e9a","Type":"ContainerDied","Data":"899cd0d4a5fe2bc7db2b77d105ae7fc7ad2946d1f0c4e8df8aca94e161909118"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.832267 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="899cd0d4a5fe2bc7db2b77d105ae7fc7ad2946d1f0c4e8df8aca94e161909118" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.832334 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rhwh6" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.835677 4834 generic.go:334] "Generic (PLEG): container finished" podID="0524d426-e238-417b-a49d-1b2ea6e37e01" containerID="6861126edf3b618720d2b51d442d9b46f20a3cd55a95bc67558ad26a1be3a7bf" exitCode=0 Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.835744 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" event={"ID":"0524d426-e238-417b-a49d-1b2ea6e37e01","Type":"ContainerDied","Data":"6861126edf3b618720d2b51d442d9b46f20a3cd55a95bc67558ad26a1be3a7bf"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.835774 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" event={"ID":"0524d426-e238-417b-a49d-1b2ea6e37e01","Type":"ContainerStarted","Data":"d11304e955545aff78493472b8be99fa3fe9e92ef7b1005b8c30913a22706474"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.838791 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5d0f-account-create-update-wjvkc" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.838842 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d0f-account-create-update-wjvkc" event={"ID":"4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4","Type":"ContainerDied","Data":"60b5e16e07fd44f14a88f71bab5914d174456be8c38c20f2abc99240c2b2e8ef"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.838876 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60b5e16e07fd44f14a88f71bab5914d174456be8c38c20f2abc99240c2b2e8ef" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.849836 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrsl9" event={"ID":"83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7","Type":"ContainerDied","Data":"db9e8b75b12909b2465fd79aa0f212f5c2c54caedf765ace8a6388c8134eca20"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.849874 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db9e8b75b12909b2465fd79aa0f212f5c2c54caedf765ace8a6388c8134eca20" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.849926 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rrsl9" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.855542 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5xww2" event={"ID":"12040694-1a05-486f-a71d-eb6940d21985","Type":"ContainerDied","Data":"89d4844a3b6e18dd1fd2f138d675daa4e0afd09c87272f2bd582f4ad6d7ac91a"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.855608 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89d4844a3b6e18dd1fd2f138d675daa4e0afd09c87272f2bd582f4ad6d7ac91a" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.856115 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5xww2" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.863194 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6vxnz" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.863185 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6vxnz" event={"ID":"63bd7fa6-422c-44c1-8d65-569d82a716f1","Type":"ContainerDied","Data":"032da939f22daf31d1e4bbb95c0577ee5c93ff4a0e83ea6a7140bc4d1506ce04"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.863306 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="032da939f22daf31d1e4bbb95c0577ee5c93ff4a0e83ea6a7140bc4d1506ce04" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.870332 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t2k4l-config-wxkd5" event={"ID":"a69828df-e6d0-4555-8c25-0e7669ad0269","Type":"ContainerDied","Data":"cc44f5a1277deb61857e63216432ee9e87c475cc2162b9ecc803ca258e954e9c"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.870693 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc44f5a1277deb61857e63216432ee9e87c475cc2162b9ecc803ca258e954e9c" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.870637 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t2k4l-config-wxkd5" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.873853 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dfde-account-create-update-m4mzz" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.873914 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dfde-account-create-update-m4mzz" event={"ID":"37293f9c-d144-4553-ab7c-04caf3e28d18","Type":"ContainerDied","Data":"40b77c1eee2cd175511a676a85b60182e02f867e6e9c220080ef77daffc323b2"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.873981 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40b77c1eee2cd175511a676a85b60182e02f867e6e9c220080ef77daffc323b2" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.878022 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2700-account-create-update-kqx8q" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.878212 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2700-account-create-update-kqx8q" event={"ID":"06bf42dd-19ce-4bef-af7c-4777a04fbdca","Type":"ContainerDied","Data":"b7370f47e823951cdd8ae76b8c933bcd300dd30c94c0ca8846ca305b120e67d3"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.878253 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7370f47e823951cdd8ae76b8c933bcd300dd30c94c0ca8846ca305b120e67d3" Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.880582 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zct6g" event={"ID":"95a073f1-b8e0-4365-ae24-c4e2351a7f0e","Type":"ContainerStarted","Data":"359014c5a5c906e4181ee98cf98574f0f2a28dbdd4a849cf7f8f605dc9d542df"} Jan 30 21:37:02 crc kubenswrapper[4834]: I0130 21:37:02.940926 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-zct6g" podStartSLOduration=7.895896509 podStartE2EDuration="12.940905848s" podCreationTimestamp="2026-01-30 21:36:50 +0000 UTC" 
firstStartedPulling="2026-01-30 21:36:57.008288193 +0000 UTC m=+1268.161434331" lastFinishedPulling="2026-01-30 21:37:02.053297512 +0000 UTC m=+1273.206443670" observedRunningTime="2026-01-30 21:37:02.92458177 +0000 UTC m=+1274.077727908" watchObservedRunningTime="2026-01-30 21:37:02.940905848 +0000 UTC m=+1274.094051986" Jan 30 21:37:03 crc kubenswrapper[4834]: I0130 21:37:03.386685 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-t2k4l-config-wxkd5"] Jan 30 21:37:03 crc kubenswrapper[4834]: I0130 21:37:03.418897 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-t2k4l-config-wxkd5"] Jan 30 21:37:03 crc kubenswrapper[4834]: I0130 21:37:03.540255 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69828df-e6d0-4555-8c25-0e7669ad0269" path="/var/lib/kubelet/pods/a69828df-e6d0-4555-8c25-0e7669ad0269/volumes" Jan 30 21:37:03 crc kubenswrapper[4834]: I0130 21:37:03.893549 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" event={"ID":"0524d426-e238-417b-a49d-1b2ea6e37e01","Type":"ContainerStarted","Data":"278fb0350aca8cc7e4857c1a345071829813fb019276031bbc82b9dc05fc597d"} Jan 30 21:37:03 crc kubenswrapper[4834]: I0130 21:37:03.893620 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:03 crc kubenswrapper[4834]: I0130 21:37:03.924497 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" podStartSLOduration=3.92447023 podStartE2EDuration="3.92447023s" podCreationTimestamp="2026-01-30 21:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:03.922935417 +0000 UTC m=+1275.076081585" watchObservedRunningTime="2026-01-30 21:37:03.92447023 +0000 UTC m=+1275.077616408" Jan 30 21:37:04 crc 
kubenswrapper[4834]: I0130 21:37:04.906351 4834 generic.go:334] "Generic (PLEG): container finished" podID="b88d2fad-923e-4f77-b8f2-d8be1858e1ff" containerID="b296177a63a222d735441c2738ca905fc4aa4146d067cda310e73d329e189780" exitCode=0 Jan 30 21:37:04 crc kubenswrapper[4834]: I0130 21:37:04.906462 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mdpvh" event={"ID":"b88d2fad-923e-4f77-b8f2-d8be1858e1ff","Type":"ContainerDied","Data":"b296177a63a222d735441c2738ca905fc4aa4146d067cda310e73d329e189780"} Jan 30 21:37:05 crc kubenswrapper[4834]: I0130 21:37:05.915580 4834 generic.go:334] "Generic (PLEG): container finished" podID="95a073f1-b8e0-4365-ae24-c4e2351a7f0e" containerID="359014c5a5c906e4181ee98cf98574f0f2a28dbdd4a849cf7f8f605dc9d542df" exitCode=0 Jan 30 21:37:05 crc kubenswrapper[4834]: I0130 21:37:05.915675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zct6g" event={"ID":"95a073f1-b8e0-4365-ae24-c4e2351a7f0e","Type":"ContainerDied","Data":"359014c5a5c906e4181ee98cf98574f0f2a28dbdd4a849cf7f8f605dc9d542df"} Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.437246 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mdpvh" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.604347 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-db-sync-config-data\") pod \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.604463 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-config-data\") pod \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.604592 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-combined-ca-bundle\") pod \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.604708 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lqwp\" (UniqueName: \"kubernetes.io/projected/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-kube-api-access-4lqwp\") pod \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\" (UID: \"b88d2fad-923e-4f77-b8f2-d8be1858e1ff\") " Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.611973 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b88d2fad-923e-4f77-b8f2-d8be1858e1ff" (UID: "b88d2fad-923e-4f77-b8f2-d8be1858e1ff"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.612167 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-kube-api-access-4lqwp" (OuterVolumeSpecName: "kube-api-access-4lqwp") pod "b88d2fad-923e-4f77-b8f2-d8be1858e1ff" (UID: "b88d2fad-923e-4f77-b8f2-d8be1858e1ff"). InnerVolumeSpecName "kube-api-access-4lqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.652656 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-config-data" (OuterVolumeSpecName: "config-data") pod "b88d2fad-923e-4f77-b8f2-d8be1858e1ff" (UID: "b88d2fad-923e-4f77-b8f2-d8be1858e1ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.652660 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b88d2fad-923e-4f77-b8f2-d8be1858e1ff" (UID: "b88d2fad-923e-4f77-b8f2-d8be1858e1ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.706539 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.706579 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.706592 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.706606 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lqwp\" (UniqueName: \"kubernetes.io/projected/b88d2fad-923e-4f77-b8f2-d8be1858e1ff-kube-api-access-4lqwp\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.926998 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mdpvh" Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.927032 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mdpvh" event={"ID":"b88d2fad-923e-4f77-b8f2-d8be1858e1ff","Type":"ContainerDied","Data":"12943535aca210ae749f92e185f6ac931fc209698b2c829a6ea64e0e252ca899"} Jan 30 21:37:06 crc kubenswrapper[4834]: I0130 21:37:06.927387 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12943535aca210ae749f92e185f6ac931fc209698b2c829a6ea64e0e252ca899" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.343801 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-4d6vx"] Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.343996 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" podUID="0524d426-e238-417b-a49d-1b2ea6e37e01" containerName="dnsmasq-dns" containerID="cri-o://278fb0350aca8cc7e4857c1a345071829813fb019276031bbc82b9dc05fc597d" gracePeriod=10 Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.379685 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-lchpg"] Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380023 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bd7fa6-422c-44c1-8d65-569d82a716f1" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380038 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bd7fa6-422c-44c1-8d65-569d82a716f1" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380047 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06bf42dd-19ce-4bef-af7c-4777a04fbdca" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380054 4834 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="06bf42dd-19ce-4bef-af7c-4777a04fbdca" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380065 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69828df-e6d0-4555-8c25-0e7669ad0269" containerName="ovn-config" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380072 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69828df-e6d0-4555-8c25-0e7669ad0269" containerName="ovn-config" Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380086 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380091 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380102 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12040694-1a05-486f-a71d-eb6940d21985" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380108 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="12040694-1a05-486f-a71d-eb6940d21985" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380119 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88d2fad-923e-4f77-b8f2-d8be1858e1ff" containerName="glance-db-sync" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380124 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88d2fad-923e-4f77-b8f2-d8be1858e1ff" containerName="glance-db-sync" Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380140 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b295ca4-d536-4367-92e2-746945500e9a" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380145 
4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b295ca4-d536-4367-92e2-746945500e9a" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380154 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37293f9c-d144-4553-ab7c-04caf3e28d18" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380160 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="37293f9c-d144-4553-ab7c-04caf3e28d18" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: E0130 21:37:07.380171 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380176 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380313 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="12040694-1a05-486f-a71d-eb6940d21985" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380328 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="06bf42dd-19ce-4bef-af7c-4777a04fbdca" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380348 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88d2fad-923e-4f77-b8f2-d8be1858e1ff" containerName="glance-db-sync" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380362 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="37293f9c-d144-4553-ab7c-04caf3e28d18" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380370 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8b295ca4-d536-4367-92e2-746945500e9a" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380380 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380387 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69828df-e6d0-4555-8c25-0e7669ad0269" containerName="ovn-config" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380409 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="63bd7fa6-422c-44c1-8d65-569d82a716f1" containerName="mariadb-database-create" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.380421 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4" containerName="mariadb-account-create-update" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.381285 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.401858 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-lchpg"] Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.419115 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scp4c\" (UniqueName: \"kubernetes.io/projected/6046ea38-25fd-4180-b390-2112e5e61ecf-kube-api-access-scp4c\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.419184 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.419209 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.419247 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-svc\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.419293 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.419319 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-config\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.436869 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zct6g" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.522289 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-svc\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.522453 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.522512 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-config\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: 
I0130 21:37:07.522557 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scp4c\" (UniqueName: \"kubernetes.io/projected/6046ea38-25fd-4180-b390-2112e5e61ecf-kube-api-access-scp4c\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.522677 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.522719 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.524410 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-svc\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.525882 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.527411 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.527891 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.527944 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-config\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.552077 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scp4c\" (UniqueName: \"kubernetes.io/projected/6046ea38-25fd-4180-b390-2112e5e61ecf-kube-api-access-scp4c\") pod \"dnsmasq-dns-895cf5cf-lchpg\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.623936 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqkbj\" (UniqueName: \"kubernetes.io/projected/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-kube-api-access-lqkbj\") pod \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.624003 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-config-data\") pod \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.624068 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-combined-ca-bundle\") pod \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\" (UID: \"95a073f1-b8e0-4365-ae24-c4e2351a7f0e\") " Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.630574 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-kube-api-access-lqkbj" (OuterVolumeSpecName: "kube-api-access-lqkbj") pod "95a073f1-b8e0-4365-ae24-c4e2351a7f0e" (UID: "95a073f1-b8e0-4365-ae24-c4e2351a7f0e"). InnerVolumeSpecName "kube-api-access-lqkbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.664972 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95a073f1-b8e0-4365-ae24-c4e2351a7f0e" (UID: "95a073f1-b8e0-4365-ae24-c4e2351a7f0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.682247 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-config-data" (OuterVolumeSpecName: "config-data") pod "95a073f1-b8e0-4365-ae24-c4e2351a7f0e" (UID: "95a073f1-b8e0-4365-ae24-c4e2351a7f0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.727247 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqkbj\" (UniqueName: \"kubernetes.io/projected/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-kube-api-access-lqkbj\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.727275 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.727286 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95a073f1-b8e0-4365-ae24-c4e2351a7f0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.744700 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.939524 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zct6g" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.939551 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zct6g" event={"ID":"95a073f1-b8e0-4365-ae24-c4e2351a7f0e","Type":"ContainerDied","Data":"4e13efd2f75fab401466db43a4a39f4e6dcb89c4e6101435f04e350251a8025d"} Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.939619 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e13efd2f75fab401466db43a4a39f4e6dcb89c4e6101435f04e350251a8025d" Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.942609 4834 generic.go:334] "Generic (PLEG): container finished" podID="0524d426-e238-417b-a49d-1b2ea6e37e01" containerID="278fb0350aca8cc7e4857c1a345071829813fb019276031bbc82b9dc05fc597d" exitCode=0 Jan 30 21:37:07 crc kubenswrapper[4834]: I0130 21:37:07.942646 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" event={"ID":"0524d426-e238-417b-a49d-1b2ea6e37e01","Type":"ContainerDied","Data":"278fb0350aca8cc7e4857c1a345071829813fb019276031bbc82b9dc05fc597d"} Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.138927 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-lchpg"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.157733 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-24kbx"] Jan 30 21:37:08 crc kubenswrapper[4834]: E0130 21:37:08.164816 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a073f1-b8e0-4365-ae24-c4e2351a7f0e" containerName="keystone-db-sync" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.164852 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a073f1-b8e0-4365-ae24-c4e2351a7f0e" containerName="keystone-db-sync" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.165137 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="95a073f1-b8e0-4365-ae24-c4e2351a7f0e" containerName="keystone-db-sync" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.165734 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.170710 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.170798 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.170813 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.171078 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.171244 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xzg4z" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.171317 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-f2xqb"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.172687 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.186941 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-24kbx"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.240459 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-f2xqb"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.243022 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-fernet-keys\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.243139 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-scripts\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.243162 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-config-data\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.243189 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-credential-keys\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: 
I0130 21:37:08.243212 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-combined-ca-bundle\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.280580 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-lchpg"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.344764 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.344808 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-scripts\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.344830 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.344876 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-config-data\") pod \"keystone-bootstrap-24kbx\" (UID: 
\"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.344897 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.345163 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-credential-keys\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.345240 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-combined-ca-bundle\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.345277 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdnzc\" (UniqueName: \"kubernetes.io/projected/880957f8-122f-48c9-b4d4-0082de01a50a-kube-api-access-bdnzc\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.345306 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq8wh\" (UniqueName: \"kubernetes.io/projected/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-kube-api-access-fq8wh\") pod 
\"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.345368 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-fernet-keys\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.345556 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.345663 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-config\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.349232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-scripts\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.356195 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-credential-keys\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " 
pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.357420 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-config-data\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.370146 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-gvxl7"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.370666 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-combined-ca-bundle\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.371142 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.374042 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.374309 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9tk2h" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.386323 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.417242 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gvxl7"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.447387 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-config\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.447480 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.447507 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.447538 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.447589 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdnzc\" (UniqueName: \"kubernetes.io/projected/880957f8-122f-48c9-b4d4-0082de01a50a-kube-api-access-bdnzc\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.447616 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8wh\" (UniqueName: \"kubernetes.io/projected/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-kube-api-access-fq8wh\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.447699 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.449235 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.453121 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.453237 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-svc\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.453845 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.457536 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-config\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.475972 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdnzc\" (UniqueName: \"kubernetes.io/projected/880957f8-122f-48c9-b4d4-0082de01a50a-kube-api-access-bdnzc\") pod \"dnsmasq-dns-6c9c9f998c-f2xqb\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.481338 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8wh\" (UniqueName: \"kubernetes.io/projected/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-kube-api-access-fq8wh\") pod 
\"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.533365 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-fernet-keys\") pod \"keystone-bootstrap-24kbx\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.533519 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.535318 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.563785 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-combined-ca-bundle\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.563894 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xc49\" (UniqueName: \"kubernetes.io/projected/bb10ef46-028a-4e54-a587-4843dae377f9-kube-api-access-8xc49\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.564200 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-config\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " 
pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.590512 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2ngf2"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.592533 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.598134 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.598483 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.599960 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-29w88" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.679979 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-config\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.680291 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-combined-ca-bundle\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.680331 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xc49\" (UniqueName: \"kubernetes.io/projected/bb10ef46-028a-4e54-a587-4843dae377f9-kube-api-access-8xc49\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " pod="openstack/neutron-db-sync-gvxl7" Jan 30 
21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.685216 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ftcrs"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.686445 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ftcrs" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.696095 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fnmp7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.696288 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.713158 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.713869 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ftcrs"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.716894 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-combined-ca-bundle\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.719017 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-config\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.728521 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2ngf2"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.749013 4834 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/placement-db-sync-5v96n"] Jan 30 21:37:08 crc kubenswrapper[4834]: E0130 21:37:08.749442 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0524d426-e238-417b-a49d-1b2ea6e37e01" containerName="dnsmasq-dns" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.749458 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0524d426-e238-417b-a49d-1b2ea6e37e01" containerName="dnsmasq-dns" Jan 30 21:37:08 crc kubenswrapper[4834]: E0130 21:37:08.749482 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0524d426-e238-417b-a49d-1b2ea6e37e01" containerName="init" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.749489 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0524d426-e238-417b-a49d-1b2ea6e37e01" containerName="init" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.756410 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xc49\" (UniqueName: \"kubernetes.io/projected/bb10ef46-028a-4e54-a587-4843dae377f9-kube-api-access-8xc49\") pod \"neutron-db-sync-gvxl7\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.768855 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0524d426-e238-417b-a49d-1b2ea6e37e01" containerName="dnsmasq-dns" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.769512 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5v96n" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.775768 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.775948 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zqf4b" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.776142 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.790677 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-scripts\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.790742 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-combined-ca-bundle\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.790796 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9knn\" (UniqueName: \"kubernetes.io/projected/25c62e65-34f5-4f5c-83fd-9af56b711bac-kube-api-access-l9knn\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.790836 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-db-sync-config-data\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.790859 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c62e65-34f5-4f5c-83fd-9af56b711bac-etc-machine-id\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.790899 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-config-data\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.807531 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5v96n"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.845276 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.849027 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.856611 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.856757 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.890433 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-f2xqb"] Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896084 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-sb\") pod \"0524d426-e238-417b-a49d-1b2ea6e37e01\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896127 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-swift-storage-0\") pod \"0524d426-e238-417b-a49d-1b2ea6e37e01\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896177 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-svc\") pod \"0524d426-e238-417b-a49d-1b2ea6e37e01\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896197 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-nb\") pod \"0524d426-e238-417b-a49d-1b2ea6e37e01\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " Jan 30 21:37:08 crc 
kubenswrapper[4834]: I0130 21:37:08.896213 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-config\") pod \"0524d426-e238-417b-a49d-1b2ea6e37e01\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896255 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gpkm\" (UniqueName: \"kubernetes.io/projected/0524d426-e238-417b-a49d-1b2ea6e37e01-kube-api-access-2gpkm\") pod \"0524d426-e238-417b-a49d-1b2ea6e37e01\" (UID: \"0524d426-e238-417b-a49d-1b2ea6e37e01\") " Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896335 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-combined-ca-bundle\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896368 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctnhx\" (UniqueName: \"kubernetes.io/projected/b49f42b3-c35a-4138-89e6-95f7abfa23bb-kube-api-access-ctnhx\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " pod="openstack/barbican-db-sync-ftcrs" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896384 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-scripts\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896415 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-scripts\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896446 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-config-data\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896463 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djrmb\" (UniqueName: \"kubernetes.io/projected/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-kube-api-access-djrmb\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896481 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e680c-6380-4ab3-a481-2f9afe8b88ff-logs\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896510 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-run-httpd\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896524 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896538 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-db-sync-config-data\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " pod="openstack/barbican-db-sync-ftcrs" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896556 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-config-data\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896572 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-scripts\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.896596 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.899033 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-combined-ca-bundle\") pod \"cinder-db-sync-2ngf2\" (UID: 
\"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.899098 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-config-data\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.899125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-log-httpd\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.899181 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9knn\" (UniqueName: \"kubernetes.io/projected/25c62e65-34f5-4f5c-83fd-9af56b711bac-kube-api-access-l9knn\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.899248 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-db-sync-config-data\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.899271 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-combined-ca-bundle\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " 
pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.899319 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c62e65-34f5-4f5c-83fd-9af56b711bac-etc-machine-id\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.899336 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75vlr\" (UniqueName: \"kubernetes.io/projected/814e680c-6380-4ab3-a481-2f9afe8b88ff-kube-api-access-75vlr\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.906920 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-scripts\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.907047 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c62e65-34f5-4f5c-83fd-9af56b711bac-etc-machine-id\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.907581 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-config-data\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.920606 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-combined-ca-bundle\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.940830 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gvxl7"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.958334 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0524d426-e238-417b-a49d-1b2ea6e37e01-kube-api-access-2gpkm" (OuterVolumeSpecName: "kube-api-access-2gpkm") pod "0524d426-e238-417b-a49d-1b2ea6e37e01" (UID: "0524d426-e238-417b-a49d-1b2ea6e37e01"). InnerVolumeSpecName "kube-api-access-2gpkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.960900 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-db-sync-config-data\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.962576 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9knn\" (UniqueName: \"kubernetes.io/projected/25c62e65-34f5-4f5c-83fd-9af56b711bac-kube-api-access-l9knn\") pod \"cinder-db-sync-2ngf2\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") " pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.983896 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-lchpg" event={"ID":"6046ea38-25fd-4180-b390-2112e5e61ecf","Type":"ContainerStarted","Data":"fdccb4db78dc07ffe5f7bf980a1ea81c4a99645db1a55df9c4d9d0aff727fab9"}
Jan 30 21:37:08 crc kubenswrapper[4834]: I0130 21:37:08.996508 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002125 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-log-httpd\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002247 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-combined-ca-bundle\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002286 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75vlr\" (UniqueName: \"kubernetes.io/projected/814e680c-6380-4ab3-a481-2f9afe8b88ff-kube-api-access-75vlr\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002306 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-combined-ca-bundle\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002323 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctnhx\" (UniqueName: \"kubernetes.io/projected/b49f42b3-c35a-4138-89e6-95f7abfa23bb-kube-api-access-ctnhx\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002339 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-scripts\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002365 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-scripts\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002419 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djrmb\" (UniqueName: \"kubernetes.io/projected/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-kube-api-access-djrmb\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002448 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e680c-6380-4ab3-a481-2f9afe8b88ff-logs\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002501 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-run-httpd\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002533 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-db-sync-config-data\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002563 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-config-data\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002604 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002658 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-config-data\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.002716 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gpkm\" (UniqueName: \"kubernetes.io/projected/0524d426-e238-417b-a49d-1b2ea6e37e01-kube-api-access-2gpkm\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.005496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-log-httpd\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.006877 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-combined-ca-bundle\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.007191 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx" event={"ID":"0524d426-e238-417b-a49d-1b2ea6e37e01","Type":"ContainerDied","Data":"d11304e955545aff78493472b8be99fa3fe9e92ef7b1005b8c30913a22706474"}
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.007264 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-run-httpd\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.007276 4834 scope.go:117] "RemoveContainer" containerID="278fb0350aca8cc7e4857c1a345071829813fb019276031bbc82b9dc05fc597d"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.008465 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e680c-6380-4ab3-a481-2f9afe8b88ff-logs\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.008609 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d5b6d6b67-4d6vx"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.019813 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-scripts\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.020295 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.021325 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-combined-ca-bundle\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.023078 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-config-data\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.026191 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-scripts\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.029185 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75vlr\" (UniqueName: \"kubernetes.io/projected/814e680c-6380-4ab3-a481-2f9afe8b88ff-kube-api-access-75vlr\") pod \"placement-db-sync-5v96n\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.029559 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djrmb\" (UniqueName: \"kubernetes.io/projected/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-kube-api-access-djrmb\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.029828 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-db-sync-config-data\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.033080 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctnhx\" (UniqueName: \"kubernetes.io/projected/b49f42b3-c35a-4138-89e6-95f7abfa23bb-kube-api-access-ctnhx\") pod \"barbican-db-sync-ftcrs\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") " pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.036309 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.041064 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-4lk6b"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.042157 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.045886 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.070725 4834 scope.go:117] "RemoveContainer" containerID="6861126edf3b618720d2b51d442d9b46f20a3cd55a95bc67558ad26a1be3a7bf"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.072580 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-config-data\") pod \"ceilometer-0\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.087979 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-4lk6b"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.105274 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.105357 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.107453 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.107537 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.107675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g69pq\" (UniqueName: \"kubernetes.io/projected/3dd33833-530e-4a48-9a21-c03a37d1c253-kube-api-access-g69pq\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.107749 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-config\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.109622 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.155480 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0524d426-e238-417b-a49d-1b2ea6e37e01" (UID: "0524d426-e238-417b-a49d-1b2ea6e37e01"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.155790 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0524d426-e238-417b-a49d-1b2ea6e37e01" (UID: "0524d426-e238-417b-a49d-1b2ea6e37e01"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.160311 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0524d426-e238-417b-a49d-1b2ea6e37e01" (UID: "0524d426-e238-417b-a49d-1b2ea6e37e01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.163246 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-config" (OuterVolumeSpecName: "config") pod "0524d426-e238-417b-a49d-1b2ea6e37e01" (UID: "0524d426-e238-417b-a49d-1b2ea6e37e01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.184231 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0524d426-e238-417b-a49d-1b2ea6e37e01" (UID: "0524d426-e238-417b-a49d-1b2ea6e37e01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.195112 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5v96n"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.209767 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g69pq\" (UniqueName: \"kubernetes.io/projected/3dd33833-530e-4a48-9a21-c03a37d1c253-kube-api-access-g69pq\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.209844 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-config\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.209870 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.209901 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.209954 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.209991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.210067 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.210080 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.210089 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.210097 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.210105 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0524d426-e238-417b-a49d-1b2ea6e37e01-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.211268 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.211275 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.211837 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-config\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.213100 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.213310 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.228932 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g69pq\" (UniqueName: \"kubernetes.io/projected/3dd33833-530e-4a48-9a21-c03a37d1c253-kube-api-access-g69pq\") pod \"dnsmasq-dns-57c957c4ff-4lk6b\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.269379 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.296852 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.299520 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.303598 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nk74b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.304101 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.308826 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.332411 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.373083 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.381207 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.384926 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.386213 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.388087 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.418574 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-logs\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.419351 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.419424 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.419466 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.419524 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlqpg\" (UniqueName: \"kubernetes.io/projected/10713394-dd58-4421-b339-af18a8ab723e-kube-api-access-dlqpg\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.419561 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.419584 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.423076 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-4d6vx"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.496926 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d5b6d6b67-4d6vx"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558171 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-logs\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558285 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-logs\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558359 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558386 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558425 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j654c\" (UniqueName: \"kubernetes.io/projected/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-kube-api-access-j654c\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558461 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-config-data\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558577 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558652 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlqpg\" (UniqueName: \"kubernetes.io/projected/10713394-dd58-4421-b339-af18a8ab723e-kube-api-access-dlqpg\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558676 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-scripts\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558697 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558721 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558767 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558802 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.558869 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-logs\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.559121 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.561922 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.573776 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0524d426-e238-417b-a49d-1b2ea6e37e01" path="/var/lib/kubelet/pods/0524d426-e238-417b-a49d-1b2ea6e37e01/volumes"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.574454 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-24kbx"]
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.574692 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.581427 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.587293 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlqpg\" (UniqueName: \"kubernetes.io/projected/10713394-dd58-4421-b339-af18a8ab723e-kube-api-access-dlqpg\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0"
Jan 30
21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.587940 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.602933 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.614223 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-f2xqb"] Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.631175 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.663439 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j654c\" (UniqueName: \"kubernetes.io/projected/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-kube-api-access-j654c\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.663496 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-config-data\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.663579 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-scripts\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.663596 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.663611 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " 
pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.663679 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-logs\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.663727 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.673368 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-config-data\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.675139 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-logs\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.675142 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.675568 4834 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.680231 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.680640 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-scripts\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.685477 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j654c\" (UniqueName: \"kubernetes.io/projected/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-kube-api-access-j654c\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.745500 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-gvxl7"] Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.746826 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:09 crc 
kubenswrapper[4834]: I0130 21:37:09.852204 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2ngf2"] Jan 30 21:37:09 crc kubenswrapper[4834]: W0130 21:37:09.854520 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25c62e65_34f5_4f5c_83fd_9af56b711bac.slice/crio-d0ffcec7bd3450452b74c854c93b31675624d5b17785819d17b823e8f4a7214c WatchSource:0}: Error finding container d0ffcec7bd3450452b74c854c93b31675624d5b17785819d17b823e8f4a7214c: Status 404 returned error can't find the container with id d0ffcec7bd3450452b74c854c93b31675624d5b17785819d17b823e8f4a7214c Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.973359 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ftcrs"] Jan 30 21:37:09 crc kubenswrapper[4834]: W0130 21:37:09.980457 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb49f42b3_c35a_4138_89e6_95f7abfa23bb.slice/crio-71659e5bda75e5aa6bfe9b4af640f68056e8af7ad8a3a77482d2e50e93c88217 WatchSource:0}: Error finding container 71659e5bda75e5aa6bfe9b4af640f68056e8af7ad8a3a77482d2e50e93c88217: Status 404 returned error can't find the container with id 71659e5bda75e5aa6bfe9b4af640f68056e8af7ad8a3a77482d2e50e93c88217 Jan 30 21:37:09 crc kubenswrapper[4834]: I0130 21:37:09.993619 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.006039 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5v96n"] Jan 30 21:37:10 crc kubenswrapper[4834]: W0130 21:37:10.012185 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc9c2a7c_f743_4351_b5b0_81ce8ef6b813.slice/crio-ae9063fcf9e1456c6400a6874fad5c401ad0139fbe25df31cd4fd667c0124225 
WatchSource:0}: Error finding container ae9063fcf9e1456c6400a6874fad5c401ad0139fbe25df31cd4fd667c0124225: Status 404 returned error can't find the container with id ae9063fcf9e1456c6400a6874fad5c401ad0139fbe25df31cd4fd667c0124225 Jan 30 21:37:10 crc kubenswrapper[4834]: W0130 21:37:10.034201 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814e680c_6380_4ab3_a481_2f9afe8b88ff.slice/crio-4ed7656294fd66a00ca2ff282955ec0b8054c2efa92691aeeb8554a873a9c6e5 WatchSource:0}: Error finding container 4ed7656294fd66a00ca2ff282955ec0b8054c2efa92691aeeb8554a873a9c6e5: Status 404 returned error can't find the container with id 4ed7656294fd66a00ca2ff282955ec0b8054c2efa92691aeeb8554a873a9c6e5 Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.034679 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" event={"ID":"880957f8-122f-48c9-b4d4-0082de01a50a","Type":"ContainerStarted","Data":"2ade48ccd4fcc866aa2ae4e147b19d1fc41af6b5d6cea1dbadca0a56efc37b5b"} Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.034742 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" event={"ID":"880957f8-122f-48c9-b4d4-0082de01a50a","Type":"ContainerStarted","Data":"6f732e8782d12b295b0179767dcc3befd9eae155f1a51765b6c83b3c4ce1b40f"} Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.037288 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.039512 4834 generic.go:334] "Generic (PLEG): container finished" podID="6046ea38-25fd-4180-b390-2112e5e61ecf" containerID="6d9061a5ffae30dad698254b3bb337c979093073ee1f3a6784444fdbc2a66fed" exitCode=0 Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.039586 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-lchpg" event={"ID":"6046ea38-25fd-4180-b390-2112e5e61ecf","Type":"ContainerDied","Data":"6d9061a5ffae30dad698254b3bb337c979093073ee1f3a6784444fdbc2a66fed"} Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.041529 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ftcrs" event={"ID":"b49f42b3-c35a-4138-89e6-95f7abfa23bb","Type":"ContainerStarted","Data":"71659e5bda75e5aa6bfe9b4af640f68056e8af7ad8a3a77482d2e50e93c88217"} Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.044919 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2ngf2" event={"ID":"25c62e65-34f5-4f5c-83fd-9af56b711bac","Type":"ContainerStarted","Data":"d0ffcec7bd3450452b74c854c93b31675624d5b17785819d17b823e8f4a7214c"} Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.048619 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-24kbx" event={"ID":"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc","Type":"ContainerStarted","Data":"55d405e8e2ddbc9806064fff95bbf045eabbeda236a827c41132dbdcdcaa706f"} Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.048665 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-24kbx" event={"ID":"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc","Type":"ContainerStarted","Data":"51e0fe83d4b854cb3e93d7dc885b237c4f7d473ac1a1017c0afec2f5fde3d3ec"} Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.068936 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-gvxl7" event={"ID":"bb10ef46-028a-4e54-a587-4843dae377f9","Type":"ContainerStarted","Data":"3611d6a458799e806d5c03d2eff95ca2f023a02330403bf3278b030797f114a6"} Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.183778 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-4lk6b"] Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.412440 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.531592 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.625334 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.703237 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.713290 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-nb\") pod \"6046ea38-25fd-4180-b390-2112e5e61ecf\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.713453 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-config\") pod \"6046ea38-25fd-4180-b390-2112e5e61ecf\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.713517 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-sb\") pod 
\"6046ea38-25fd-4180-b390-2112e5e61ecf\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.713573 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scp4c\" (UniqueName: \"kubernetes.io/projected/6046ea38-25fd-4180-b390-2112e5e61ecf-kube-api-access-scp4c\") pod \"6046ea38-25fd-4180-b390-2112e5e61ecf\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.713626 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-svc\") pod \"6046ea38-25fd-4180-b390-2112e5e61ecf\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.713650 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-swift-storage-0\") pod \"6046ea38-25fd-4180-b390-2112e5e61ecf\" (UID: \"6046ea38-25fd-4180-b390-2112e5e61ecf\") " Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.732760 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6046ea38-25fd-4180-b390-2112e5e61ecf-kube-api-access-scp4c" (OuterVolumeSpecName: "kube-api-access-scp4c") pod "6046ea38-25fd-4180-b390-2112e5e61ecf" (UID: "6046ea38-25fd-4180-b390-2112e5e61ecf"). InnerVolumeSpecName "kube-api-access-scp4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.766274 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6046ea38-25fd-4180-b390-2112e5e61ecf" (UID: "6046ea38-25fd-4180-b390-2112e5e61ecf"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.766730 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6046ea38-25fd-4180-b390-2112e5e61ecf" (UID: "6046ea38-25fd-4180-b390-2112e5e61ecf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.767920 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-config" (OuterVolumeSpecName: "config") pod "6046ea38-25fd-4180-b390-2112e5e61ecf" (UID: "6046ea38-25fd-4180-b390-2112e5e61ecf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.774170 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.777716 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6046ea38-25fd-4180-b390-2112e5e61ecf" (UID: "6046ea38-25fd-4180-b390-2112e5e61ecf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.783055 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6046ea38-25fd-4180-b390-2112e5e61ecf" (UID: "6046ea38-25fd-4180-b390-2112e5e61ecf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.792155 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.816119 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.816244 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.816439 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.816456 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scp4c\" (UniqueName: \"kubernetes.io/projected/6046ea38-25fd-4180-b390-2112e5e61ecf-kube-api-access-scp4c\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.816467 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:10 crc kubenswrapper[4834]: I0130 21:37:10.816475 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046ea38-25fd-4180-b390-2112e5e61ecf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.087580 4834 generic.go:334] "Generic (PLEG): container finished" podID="3dd33833-530e-4a48-9a21-c03a37d1c253" 
containerID="93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c" exitCode=0 Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.088331 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" event={"ID":"3dd33833-530e-4a48-9a21-c03a37d1c253","Type":"ContainerDied","Data":"93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.088360 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" event={"ID":"3dd33833-530e-4a48-9a21-c03a37d1c253","Type":"ContainerStarted","Data":"f86e0085ac0484154b7953f08ef10c27d4180f5876e4f9fc16822477b210eba5"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.093967 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gvxl7" event={"ID":"bb10ef46-028a-4e54-a587-4843dae377f9","Type":"ContainerStarted","Data":"3e7ec2eafede4e5b75a15bc96c461c617bfe6bc2a6ddad530a1360ad09a8f347"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.113561 4834 generic.go:334] "Generic (PLEG): container finished" podID="880957f8-122f-48c9-b4d4-0082de01a50a" containerID="2ade48ccd4fcc866aa2ae4e147b19d1fc41af6b5d6cea1dbadca0a56efc37b5b" exitCode=0 Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.113639 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" event={"ID":"880957f8-122f-48c9-b4d4-0082de01a50a","Type":"ContainerDied","Data":"2ade48ccd4fcc866aa2ae4e147b19d1fc41af6b5d6cea1dbadca0a56efc37b5b"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.124811 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-895cf5cf-lchpg" event={"ID":"6046ea38-25fd-4180-b390-2112e5e61ecf","Type":"ContainerDied","Data":"fdccb4db78dc07ffe5f7bf980a1ea81c4a99645db1a55df9c4d9d0aff727fab9"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.124889 4834 scope.go:117] 
"RemoveContainer" containerID="6d9061a5ffae30dad698254b3bb337c979093073ee1f3a6784444fdbc2a66fed" Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.125060 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-895cf5cf-lchpg" Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.135510 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerStarted","Data":"ae9063fcf9e1456c6400a6874fad5c401ad0139fbe25df31cd4fd667c0124225"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.136949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e523b7ff-b98d-4428-b93d-5cd803cbf1a0","Type":"ContainerStarted","Data":"13dfdc5763d3289a48f26fa06257d61794a05ff28fc6739dc2347ffd087b0e26"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.138933 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5v96n" event={"ID":"814e680c-6380-4ab3-a481-2f9afe8b88ff","Type":"ContainerStarted","Data":"4ed7656294fd66a00ca2ff282955ec0b8054c2efa92691aeeb8554a873a9c6e5"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.140345 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-gvxl7" podStartSLOduration=3.1403300180000002 podStartE2EDuration="3.140330018s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:11.13079587 +0000 UTC m=+1282.283942008" watchObservedRunningTime="2026-01-30 21:37:11.140330018 +0000 UTC m=+1282.293476156" Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.142362 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"10713394-dd58-4421-b339-af18a8ab723e","Type":"ContainerStarted","Data":"b959b796163c14d2d0cc63b805d8ae5b518f527eaca84f9ff8cfe17d7e7a35ad"} Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.229458 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-lchpg"] Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.240335 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-895cf5cf-lchpg"] Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.250644 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-24kbx" podStartSLOduration=3.250622638 podStartE2EDuration="3.250622638s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:11.211454547 +0000 UTC m=+1282.364600685" watchObservedRunningTime="2026-01-30 21:37:11.250622638 +0000 UTC m=+1282.403768766" Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.563963 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6046ea38-25fd-4180-b390-2112e5e61ecf" path="/var/lib/kubelet/pods/6046ea38-25fd-4180-b390-2112e5e61ecf/volumes" Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.798501 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.987327 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-sb\") pod \"880957f8-122f-48c9-b4d4-0082de01a50a\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.987471 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-swift-storage-0\") pod \"880957f8-122f-48c9-b4d4-0082de01a50a\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.987625 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-svc\") pod \"880957f8-122f-48c9-b4d4-0082de01a50a\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.987676 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-nb\") pod \"880957f8-122f-48c9-b4d4-0082de01a50a\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.987736 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdnzc\" (UniqueName: \"kubernetes.io/projected/880957f8-122f-48c9-b4d4-0082de01a50a-kube-api-access-bdnzc\") pod \"880957f8-122f-48c9-b4d4-0082de01a50a\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " Jan 30 21:37:11 crc kubenswrapper[4834]: I0130 21:37:11.987956 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-config\") pod \"880957f8-122f-48c9-b4d4-0082de01a50a\" (UID: \"880957f8-122f-48c9-b4d4-0082de01a50a\") " Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:11.998915 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880957f8-122f-48c9-b4d4-0082de01a50a-kube-api-access-bdnzc" (OuterVolumeSpecName: "kube-api-access-bdnzc") pod "880957f8-122f-48c9-b4d4-0082de01a50a" (UID: "880957f8-122f-48c9-b4d4-0082de01a50a"). InnerVolumeSpecName "kube-api-access-bdnzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.013150 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "880957f8-122f-48c9-b4d4-0082de01a50a" (UID: "880957f8-122f-48c9-b4d4-0082de01a50a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.014887 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "880957f8-122f-48c9-b4d4-0082de01a50a" (UID: "880957f8-122f-48c9-b4d4-0082de01a50a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.029340 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "880957f8-122f-48c9-b4d4-0082de01a50a" (UID: "880957f8-122f-48c9-b4d4-0082de01a50a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.032725 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "880957f8-122f-48c9-b4d4-0082de01a50a" (UID: "880957f8-122f-48c9-b4d4-0082de01a50a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.042106 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-config" (OuterVolumeSpecName: "config") pod "880957f8-122f-48c9-b4d4-0082de01a50a" (UID: "880957f8-122f-48c9-b4d4-0082de01a50a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.091093 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.091128 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.091145 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdnzc\" (UniqueName: \"kubernetes.io/projected/880957f8-122f-48c9-b4d4-0082de01a50a-kube-api-access-bdnzc\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.091159 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 
21:37:12.091172 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.091185 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/880957f8-122f-48c9-b4d4-0082de01a50a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.155203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10713394-dd58-4421-b339-af18a8ab723e","Type":"ContainerStarted","Data":"d73ded76d22a6af4269b6199e118f1910662d47d40ea3e0bd01206b2c6dbf437"} Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.157311 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" event={"ID":"3dd33833-530e-4a48-9a21-c03a37d1c253","Type":"ContainerStarted","Data":"f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57"} Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.157482 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.160684 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.160746 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c9f998c-f2xqb" event={"ID":"880957f8-122f-48c9-b4d4-0082de01a50a","Type":"ContainerDied","Data":"6f732e8782d12b295b0179767dcc3befd9eae155f1a51765b6c83b3c4ce1b40f"} Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.160773 4834 scope.go:117] "RemoveContainer" containerID="2ade48ccd4fcc866aa2ae4e147b19d1fc41af6b5d6cea1dbadca0a56efc37b5b" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.182673 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" podStartSLOduration=4.182655883 podStartE2EDuration="4.182655883s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:12.171658134 +0000 UTC m=+1283.324804302" watchObservedRunningTime="2026-01-30 21:37:12.182655883 +0000 UTC m=+1283.335802021" Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.235299 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-f2xqb"] Jan 30 21:37:12 crc kubenswrapper[4834]: I0130 21:37:12.246059 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c9f998c-f2xqb"] Jan 30 21:37:13 crc kubenswrapper[4834]: I0130 21:37:13.553837 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="880957f8-122f-48c9-b4d4-0082de01a50a" path="/var/lib/kubelet/pods/880957f8-122f-48c9-b4d4-0082de01a50a/volumes" Jan 30 21:37:16 crc kubenswrapper[4834]: I0130 21:37:16.237799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e523b7ff-b98d-4428-b93d-5cd803cbf1a0","Type":"ContainerStarted","Data":"cea6222f01a6097a66d1bce8214b33c5e38a7c37fa6450c2b40f7c385ec16ac6"} Jan 30 21:37:17 crc kubenswrapper[4834]: I0130 21:37:17.253433 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10713394-dd58-4421-b339-af18a8ab723e","Type":"ContainerStarted","Data":"955c7436d23f4bb1af15bd6caa812dd3428b2f90f96f08c1d83310bd3442a7ef"} Jan 30 21:37:17 crc kubenswrapper[4834]: I0130 21:37:17.253599 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="10713394-dd58-4421-b339-af18a8ab723e" containerName="glance-log" containerID="cri-o://d73ded76d22a6af4269b6199e118f1910662d47d40ea3e0bd01206b2c6dbf437" gracePeriod=30 Jan 30 21:37:17 crc kubenswrapper[4834]: I0130 21:37:17.253707 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="10713394-dd58-4421-b339-af18a8ab723e" containerName="glance-httpd" containerID="cri-o://955c7436d23f4bb1af15bd6caa812dd3428b2f90f96f08c1d83310bd3442a7ef" gracePeriod=30 Jan 30 21:37:17 crc kubenswrapper[4834]: I0130 21:37:17.275522 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.275505926 podStartE2EDuration="9.275505926s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:17.271351919 +0000 UTC m=+1288.424498087" watchObservedRunningTime="2026-01-30 21:37:17.275505926 +0000 UTC m=+1288.428652064" Jan 30 21:37:18 crc kubenswrapper[4834]: I0130 21:37:18.264597 4834 generic.go:334] "Generic (PLEG): container finished" podID="10713394-dd58-4421-b339-af18a8ab723e" 
containerID="955c7436d23f4bb1af15bd6caa812dd3428b2f90f96f08c1d83310bd3442a7ef" exitCode=0 Jan 30 21:37:18 crc kubenswrapper[4834]: I0130 21:37:18.264639 4834 generic.go:334] "Generic (PLEG): container finished" podID="10713394-dd58-4421-b339-af18a8ab723e" containerID="d73ded76d22a6af4269b6199e118f1910662d47d40ea3e0bd01206b2c6dbf437" exitCode=143 Jan 30 21:37:18 crc kubenswrapper[4834]: I0130 21:37:18.264659 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10713394-dd58-4421-b339-af18a8ab723e","Type":"ContainerDied","Data":"955c7436d23f4bb1af15bd6caa812dd3428b2f90f96f08c1d83310bd3442a7ef"} Jan 30 21:37:18 crc kubenswrapper[4834]: I0130 21:37:18.264701 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10713394-dd58-4421-b339-af18a8ab723e","Type":"ContainerDied","Data":"d73ded76d22a6af4269b6199e118f1910662d47d40ea3e0bd01206b2c6dbf437"} Jan 30 21:37:18 crc kubenswrapper[4834]: I0130 21:37:18.266914 4834 generic.go:334] "Generic (PLEG): container finished" podID="9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" containerID="55d405e8e2ddbc9806064fff95bbf045eabbeda236a827c41132dbdcdcaa706f" exitCode=0 Jan 30 21:37:18 crc kubenswrapper[4834]: I0130 21:37:18.266977 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-24kbx" event={"ID":"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc","Type":"ContainerDied","Data":"55d405e8e2ddbc9806064fff95bbf045eabbeda236a827c41132dbdcdcaa706f"} Jan 30 21:37:19 crc kubenswrapper[4834]: I0130 21:37:19.391568 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" Jan 30 21:37:19 crc kubenswrapper[4834]: I0130 21:37:19.465002 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g4wpn"] Jan 30 21:37:19 crc kubenswrapper[4834]: I0130 21:37:19.465495 4834 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="dnsmasq-dns" containerID="cri-o://711184ec698d8d0bf2405d0db4475397b2b664ce9e5c476a30b3ce5b576ef5dc" gracePeriod=10 Jan 30 21:37:19 crc kubenswrapper[4834]: E0130 21:37:19.524945 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3321163d_92e3_4a02_8700_5d403f01f247.slice/crio-711184ec698d8d0bf2405d0db4475397b2b664ce9e5c476a30b3ce5b576ef5dc.scope\": RecentStats: unable to find data in memory cache]" Jan 30 21:37:19 crc kubenswrapper[4834]: I0130 21:37:19.927603 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Jan 30 21:37:20 crc kubenswrapper[4834]: I0130 21:37:20.284742 4834 generic.go:334] "Generic (PLEG): container finished" podID="3321163d-92e3-4a02-8700-5d403f01f247" containerID="711184ec698d8d0bf2405d0db4475397b2b664ce9e5c476a30b3ce5b576ef5dc" exitCode=0 Jan 30 21:37:20 crc kubenswrapper[4834]: I0130 21:37:20.284778 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" event={"ID":"3321163d-92e3-4a02-8700-5d403f01f247","Type":"ContainerDied","Data":"711184ec698d8d0bf2405d0db4475397b2b664ce9e5c476a30b3ce5b576ef5dc"} Jan 30 21:37:24 crc kubenswrapper[4834]: I0130 21:37:24.928099 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Jan 30 21:37:27 crc kubenswrapper[4834]: I0130 21:37:27.356097 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"e523b7ff-b98d-4428-b93d-5cd803cbf1a0","Type":"ContainerStarted","Data":"94969cc05f2e48f85144e89c2c40b825cca2e62fa2e6c9f6972d9eaea7409685"} Jan 30 21:37:27 crc kubenswrapper[4834]: I0130 21:37:27.356314 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerName="glance-log" containerID="cri-o://cea6222f01a6097a66d1bce8214b33c5e38a7c37fa6450c2b40f7c385ec16ac6" gracePeriod=30 Jan 30 21:37:27 crc kubenswrapper[4834]: I0130 21:37:27.356350 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerName="glance-httpd" containerID="cri-o://94969cc05f2e48f85144e89c2c40b825cca2e62fa2e6c9f6972d9eaea7409685" gracePeriod=30 Jan 30 21:37:27 crc kubenswrapper[4834]: I0130 21:37:27.383622 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.383599931 podStartE2EDuration="19.383599931s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:27.37784904 +0000 UTC m=+1298.530995178" watchObservedRunningTime="2026-01-30 21:37:27.383599931 +0000 UTC m=+1298.536746069" Jan 30 21:37:28 crc kubenswrapper[4834]: I0130 21:37:28.377958 4834 generic.go:334] "Generic (PLEG): container finished" podID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerID="94969cc05f2e48f85144e89c2c40b825cca2e62fa2e6c9f6972d9eaea7409685" exitCode=0 Jan 30 21:37:28 crc kubenswrapper[4834]: I0130 21:37:28.377999 4834 generic.go:334] "Generic (PLEG): container finished" podID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerID="cea6222f01a6097a66d1bce8214b33c5e38a7c37fa6450c2b40f7c385ec16ac6" 
exitCode=143 Jan 30 21:37:28 crc kubenswrapper[4834]: I0130 21:37:28.378023 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e523b7ff-b98d-4428-b93d-5cd803cbf1a0","Type":"ContainerDied","Data":"94969cc05f2e48f85144e89c2c40b825cca2e62fa2e6c9f6972d9eaea7409685"} Jan 30 21:37:28 crc kubenswrapper[4834]: I0130 21:37:28.378051 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e523b7ff-b98d-4428-b93d-5cd803cbf1a0","Type":"ContainerDied","Data":"cea6222f01a6097a66d1bce8214b33c5e38a7c37fa6450c2b40f7c385ec16ac6"} Jan 30 21:37:29 crc kubenswrapper[4834]: I0130 21:37:29.927671 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Jan 30 21:37:29 crc kubenswrapper[4834]: I0130 21:37:29.928026 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:37:34 crc kubenswrapper[4834]: I0130 21:37:34.161712 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:37:34 crc kubenswrapper[4834]: I0130 21:37:34.163744 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:37:34 crc kubenswrapper[4834]: I0130 21:37:34.927583 4834 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Jan 30 21:37:36 crc kubenswrapper[4834]: E0130 21:37:36.439693 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Jan 30 21:37:36 crc kubenswrapper[4834]: E0130 21:37:36.440328 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75vlr,ReadOnly:true,MountPath:/var/run/secrets/ku
bernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-5v96n_openstack(814e680c-6380-4ab3-a481-2f9afe8b88ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:37:36 crc kubenswrapper[4834]: E0130 21:37:36.441705 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-5v96n" podUID="814e680c-6380-4ab3-a481-2f9afe8b88ff" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.477565 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-24kbx" event={"ID":"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc","Type":"ContainerDied","Data":"51e0fe83d4b854cb3e93d7dc885b237c4f7d473ac1a1017c0afec2f5fde3d3ec"} Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.477652 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51e0fe83d4b854cb3e93d7dc885b237c4f7d473ac1a1017c0afec2f5fde3d3ec" Jan 30 21:37:36 crc kubenswrapper[4834]: E0130 21:37:36.483683 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-5v96n" podUID="814e680c-6380-4ab3-a481-2f9afe8b88ff" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.582161 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.697260 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-combined-ca-bundle\") pod \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.697338 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq8wh\" (UniqueName: \"kubernetes.io/projected/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-kube-api-access-fq8wh\") pod \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.697405 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-fernet-keys\") pod \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.697480 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-config-data\") pod \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.697524 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-credential-keys\") pod \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.697697 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-scripts\") pod \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\" (UID: \"9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc\") " Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.710223 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-scripts" (OuterVolumeSpecName: "scripts") pod "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" (UID: "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.712793 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" (UID: "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.723934 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-kube-api-access-fq8wh" (OuterVolumeSpecName: "kube-api-access-fq8wh") pod "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" (UID: "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc"). InnerVolumeSpecName "kube-api-access-fq8wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.724476 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" (UID: "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.747711 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-config-data" (OuterVolumeSpecName: "config-data") pod "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" (UID: "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.749245 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" (UID: "9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.800100 4834 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.800131 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.800141 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.800149 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq8wh\" (UniqueName: \"kubernetes.io/projected/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-kube-api-access-fq8wh\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.800158 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:36 crc kubenswrapper[4834]: I0130 21:37:36.800166 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.486598 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-24kbx" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.671164 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-24kbx"] Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.678163 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-24kbx"] Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.794734 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8m5jq"] Jan 30 21:37:37 crc kubenswrapper[4834]: E0130 21:37:37.795185 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6046ea38-25fd-4180-b390-2112e5e61ecf" containerName="init" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.795208 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6046ea38-25fd-4180-b390-2112e5e61ecf" containerName="init" Jan 30 21:37:37 crc kubenswrapper[4834]: E0130 21:37:37.795231 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" containerName="keystone-bootstrap" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.795240 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" containerName="keystone-bootstrap" Jan 30 21:37:37 crc kubenswrapper[4834]: E0130 21:37:37.795251 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880957f8-122f-48c9-b4d4-0082de01a50a" containerName="init" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.795259 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="880957f8-122f-48c9-b4d4-0082de01a50a" containerName="init" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.795504 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6046ea38-25fd-4180-b390-2112e5e61ecf" containerName="init" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.795525 4834 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" containerName="keystone-bootstrap" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.795540 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="880957f8-122f-48c9-b4d4-0082de01a50a" containerName="init" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.803190 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.805688 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.806039 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.806287 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.808155 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xzg4z" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.809065 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.818488 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8m5jq"] Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.824921 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-credential-keys\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.825008 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7mt2b\" (UniqueName: \"kubernetes.io/projected/11884add-09c1-45c0-92fc-9da5ebd51af3-kube-api-access-7mt2b\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.825051 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-config-data\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.825108 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-combined-ca-bundle\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.825150 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-scripts\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.825180 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-fernet-keys\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.851110 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:37:37 crc kubenswrapper[4834]: E0130 21:37:37.862576 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 30 21:37:37 crc kubenswrapper[4834]: E0130 21:37:37.862752 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath
:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l9knn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-2ngf2_openstack(25c62e65-34f5-4f5c-83fd-9af56b711bac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:37:37 crc kubenswrapper[4834]: E0130 21:37:37.863919 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-2ngf2" podUID="25c62e65-34f5-4f5c-83fd-9af56b711bac" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.927161 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdhnm\" (UniqueName: \"kubernetes.io/projected/3321163d-92e3-4a02-8700-5d403f01f247-kube-api-access-rdhnm\") pod \"3321163d-92e3-4a02-8700-5d403f01f247\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.927631 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-dns-svc\") 
pod \"3321163d-92e3-4a02-8700-5d403f01f247\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.927815 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-config\") pod \"3321163d-92e3-4a02-8700-5d403f01f247\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.928050 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-sb\") pod \"3321163d-92e3-4a02-8700-5d403f01f247\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.928244 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-nb\") pod \"3321163d-92e3-4a02-8700-5d403f01f247\" (UID: \"3321163d-92e3-4a02-8700-5d403f01f247\") " Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.928755 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-config-data\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.928971 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-combined-ca-bundle\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.929146 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-scripts\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.929366 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-fernet-keys\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.929739 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-credential-keys\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.929972 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mt2b\" (UniqueName: \"kubernetes.io/projected/11884add-09c1-45c0-92fc-9da5ebd51af3-kube-api-access-7mt2b\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.955825 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-config-data\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.959855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-fernet-keys\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.960146 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3321163d-92e3-4a02-8700-5d403f01f247-kube-api-access-rdhnm" (OuterVolumeSpecName: "kube-api-access-rdhnm") pod "3321163d-92e3-4a02-8700-5d403f01f247" (UID: "3321163d-92e3-4a02-8700-5d403f01f247"). InnerVolumeSpecName "kube-api-access-rdhnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.960673 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-credential-keys\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.966916 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-combined-ca-bundle\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.967214 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-scripts\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:37 crc kubenswrapper[4834]: I0130 21:37:37.974551 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mt2b\" (UniqueName: 
\"kubernetes.io/projected/11884add-09c1-45c0-92fc-9da5ebd51af3-kube-api-access-7mt2b\") pod \"keystone-bootstrap-8m5jq\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.008072 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3321163d-92e3-4a02-8700-5d403f01f247" (UID: "3321163d-92e3-4a02-8700-5d403f01f247"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.020030 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-config" (OuterVolumeSpecName: "config") pod "3321163d-92e3-4a02-8700-5d403f01f247" (UID: "3321163d-92e3-4a02-8700-5d403f01f247"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.031767 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3321163d-92e3-4a02-8700-5d403f01f247" (UID: "3321163d-92e3-4a02-8700-5d403f01f247"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.032363 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.032405 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdhnm\" (UniqueName: \"kubernetes.io/projected/3321163d-92e3-4a02-8700-5d403f01f247-kube-api-access-rdhnm\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.032418 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.032430 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.035506 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3321163d-92e3-4a02-8700-5d403f01f247" (UID: "3321163d-92e3-4a02-8700-5d403f01f247"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.134755 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3321163d-92e3-4a02-8700-5d403f01f247-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.162939 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.366215 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: E0130 21:37:38.371191 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 30 21:37:38 crc kubenswrapper[4834]: E0130 21:37:38.371385 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctnhx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunA
sGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-ftcrs_openstack(b49f42b3-c35a-4138-89e6-95f7abfa23bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 21:37:38 crc kubenswrapper[4834]: E0130 21:37:38.373037 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-ftcrs" podUID="b49f42b3-c35a-4138-89e6-95f7abfa23bb" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.440911 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"10713394-dd58-4421-b339-af18a8ab723e\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.441210 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-scripts\") pod \"10713394-dd58-4421-b339-af18a8ab723e\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.441256 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-httpd-run\") pod \"10713394-dd58-4421-b339-af18a8ab723e\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.441287 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-config-data\") pod \"10713394-dd58-4421-b339-af18a8ab723e\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.441372 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-logs\") pod \"10713394-dd58-4421-b339-af18a8ab723e\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.441452 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-combined-ca-bundle\") pod \"10713394-dd58-4421-b339-af18a8ab723e\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.441582 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlqpg\" (UniqueName: \"kubernetes.io/projected/10713394-dd58-4421-b339-af18a8ab723e-kube-api-access-dlqpg\") pod \"10713394-dd58-4421-b339-af18a8ab723e\" (UID: \"10713394-dd58-4421-b339-af18a8ab723e\") " Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.441954 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "10713394-dd58-4421-b339-af18a8ab723e" (UID: "10713394-dd58-4421-b339-af18a8ab723e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.442161 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.442347 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-logs" (OuterVolumeSpecName: "logs") pod "10713394-dd58-4421-b339-af18a8ab723e" (UID: "10713394-dd58-4421-b339-af18a8ab723e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.445348 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "10713394-dd58-4421-b339-af18a8ab723e" (UID: "10713394-dd58-4421-b339-af18a8ab723e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.445536 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10713394-dd58-4421-b339-af18a8ab723e-kube-api-access-dlqpg" (OuterVolumeSpecName: "kube-api-access-dlqpg") pod "10713394-dd58-4421-b339-af18a8ab723e" (UID: "10713394-dd58-4421-b339-af18a8ab723e"). InnerVolumeSpecName "kube-api-access-dlqpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.447887 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-scripts" (OuterVolumeSpecName: "scripts") pod "10713394-dd58-4421-b339-af18a8ab723e" (UID: "10713394-dd58-4421-b339-af18a8ab723e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.471504 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10713394-dd58-4421-b339-af18a8ab723e" (UID: "10713394-dd58-4421-b339-af18a8ab723e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.502848 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" event={"ID":"3321163d-92e3-4a02-8700-5d403f01f247","Type":"ContainerDied","Data":"754287b86e5c3b741e1b5e35b824c2efa2fe1138cbcb896e9eb0cdc3f52d218a"} Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.502896 4834 scope.go:117] "RemoveContainer" containerID="711184ec698d8d0bf2405d0db4475397b2b664ce9e5c476a30b3ce5b576ef5dc" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.502910 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-g4wpn" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.506363 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-config-data" (OuterVolumeSpecName: "config-data") pod "10713394-dd58-4421-b339-af18a8ab723e" (UID: "10713394-dd58-4421-b339-af18a8ab723e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.511799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"10713394-dd58-4421-b339-af18a8ab723e","Type":"ContainerDied","Data":"b959b796163c14d2d0cc63b805d8ae5b518f527eaca84f9ff8cfe17d7e7a35ad"} Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.511953 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: E0130 21:37:38.516495 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-ftcrs" podUID="b49f42b3-c35a-4138-89e6-95f7abfa23bb" Jan 30 21:37:38 crc kubenswrapper[4834]: E0130 21:37:38.516650 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-2ngf2" podUID="25c62e65-34f5-4f5c-83fd-9af56b711bac" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.538904 4834 scope.go:117] "RemoveContainer" containerID="b0fbc3a50253a07c35c85086ae5abdb6aa02109b5d971a11d59f3916069cc340" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.543963 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.543985 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-scripts\") on 
node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.543995 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.544008 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10713394-dd58-4421-b339-af18a8ab723e-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.544017 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10713394-dd58-4421-b339-af18a8ab723e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.544028 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlqpg\" (UniqueName: \"kubernetes.io/projected/10713394-dd58-4421-b339-af18a8ab723e-kube-api-access-dlqpg\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.565053 4834 scope.go:117] "RemoveContainer" containerID="955c7436d23f4bb1af15bd6caa812dd3428b2f90f96f08c1d83310bd3442a7ef" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.591658 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g4wpn"] Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.618258 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-g4wpn"] Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.619570 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.626520 4834 scope.go:117] "RemoveContainer" 
containerID="d73ded76d22a6af4269b6199e118f1910662d47d40ea3e0bd01206b2c6dbf437" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.632008 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.643278 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.648065 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.651755 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:37:38 crc kubenswrapper[4834]: E0130 21:37:38.652209 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="dnsmasq-dns" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.652237 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="dnsmasq-dns" Jan 30 21:37:38 crc kubenswrapper[4834]: E0130 21:37:38.652271 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10713394-dd58-4421-b339-af18a8ab723e" containerName="glance-log" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.652280 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="10713394-dd58-4421-b339-af18a8ab723e" containerName="glance-log" Jan 30 21:37:38 crc kubenswrapper[4834]: E0130 21:37:38.652300 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="init" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.652307 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="init" Jan 30 21:37:38 crc 
kubenswrapper[4834]: E0130 21:37:38.652320 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10713394-dd58-4421-b339-af18a8ab723e" containerName="glance-httpd" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.652327 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="10713394-dd58-4421-b339-af18a8ab723e" containerName="glance-httpd" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.652598 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3321163d-92e3-4a02-8700-5d403f01f247" containerName="dnsmasq-dns" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.652635 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="10713394-dd58-4421-b339-af18a8ab723e" containerName="glance-log" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.652656 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="10713394-dd58-4421-b339-af18a8ab723e" containerName="glance-httpd" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.653853 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.656861 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.657637 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.664529 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.749555 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.749847 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.749946 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m26h\" (UniqueName: \"kubernetes.io/projected/4075b406-33bb-40e3-9429-087ba19fcb32-kube-api-access-9m26h\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.750017 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.750089 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.750167 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.750428 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-logs\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.750502 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.852874 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.852971 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.853009 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m26h\" (UniqueName: \"kubernetes.io/projected/4075b406-33bb-40e3-9429-087ba19fcb32-kube-api-access-9m26h\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.853033 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.853058 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.853085 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.853103 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-logs\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.853120 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.853775 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.855179 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.855730 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.866170 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.866699 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.871480 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.879052 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m26h\" (UniqueName: \"kubernetes.io/projected/4075b406-33bb-40e3-9429-087ba19fcb32-kube-api-access-9m26h\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.895129 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.913885 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.943227 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8m5jq"] Jan 30 21:37:38 crc kubenswrapper[4834]: I0130 21:37:38.977275 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.165430 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.289489 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-config-data\") pod \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.289562 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.289615 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-httpd-run\") pod \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 
21:37:39.289635 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-scripts\") pod \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.289727 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-combined-ca-bundle\") pod \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.289762 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-logs\") pod \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.289886 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j654c\" (UniqueName: \"kubernetes.io/projected/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-kube-api-access-j654c\") pod \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\" (UID: \"e523b7ff-b98d-4428-b93d-5cd803cbf1a0\") " Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.290005 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e523b7ff-b98d-4428-b93d-5cd803cbf1a0" (UID: "e523b7ff-b98d-4428-b93d-5cd803cbf1a0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.290488 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.290518 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-logs" (OuterVolumeSpecName: "logs") pod "e523b7ff-b98d-4428-b93d-5cd803cbf1a0" (UID: "e523b7ff-b98d-4428-b93d-5cd803cbf1a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.294601 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-scripts" (OuterVolumeSpecName: "scripts") pod "e523b7ff-b98d-4428-b93d-5cd803cbf1a0" (UID: "e523b7ff-b98d-4428-b93d-5cd803cbf1a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.294987 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "e523b7ff-b98d-4428-b93d-5cd803cbf1a0" (UID: "e523b7ff-b98d-4428-b93d-5cd803cbf1a0"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.295317 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-kube-api-access-j654c" (OuterVolumeSpecName: "kube-api-access-j654c") pod "e523b7ff-b98d-4428-b93d-5cd803cbf1a0" (UID: "e523b7ff-b98d-4428-b93d-5cd803cbf1a0"). InnerVolumeSpecName "kube-api-access-j654c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.315524 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e523b7ff-b98d-4428-b93d-5cd803cbf1a0" (UID: "e523b7ff-b98d-4428-b93d-5cd803cbf1a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.336035 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-config-data" (OuterVolumeSpecName: "config-data") pod "e523b7ff-b98d-4428-b93d-5cd803cbf1a0" (UID: "e523b7ff-b98d-4428-b93d-5cd803cbf1a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.391839 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j654c\" (UniqueName: \"kubernetes.io/projected/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-kube-api-access-j654c\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.391881 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.391926 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.391941 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:39 crc 
kubenswrapper[4834]: I0130 21:37:39.391954 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.391969 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e523b7ff-b98d-4428-b93d-5cd803cbf1a0-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.414125 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.492810 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.493771 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.524091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4075b406-33bb-40e3-9429-087ba19fcb32","Type":"ContainerStarted","Data":"3cf0ab03c8105f4e5c3dc75fab9cc90350e6803d93da618c86c76fec018ca38d"} Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.526184 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8m5jq" event={"ID":"11884add-09c1-45c0-92fc-9da5ebd51af3","Type":"ContainerStarted","Data":"54d881f2ae455fd50bee56d740d60633535ae77375bdadbfcff9928bbf3b2e7e"} Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.526215 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8m5jq" 
event={"ID":"11884add-09c1-45c0-92fc-9da5ebd51af3","Type":"ContainerStarted","Data":"54988d748eefe10ae7e45f613c068e8c2120bd7598b373400a4239a40a1707df"} Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.529450 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerStarted","Data":"8589392b4743297e8e6c586a063660bfbdedad8c1828ca48c68c4e86a25a0031"} Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.534153 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.559708 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8m5jq" podStartSLOduration=2.5596753359999997 podStartE2EDuration="2.559675336s" podCreationTimestamp="2026-01-30 21:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:39.541997749 +0000 UTC m=+1310.695143927" watchObservedRunningTime="2026-01-30 21:37:39.559675336 +0000 UTC m=+1310.712821514" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.582230 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10713394-dd58-4421-b339-af18a8ab723e" path="/var/lib/kubelet/pods/10713394-dd58-4421-b339-af18a8ab723e/volumes" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.582985 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3321163d-92e3-4a02-8700-5d403f01f247" path="/var/lib/kubelet/pods/3321163d-92e3-4a02-8700-5d403f01f247/volumes" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.585532 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc" path="/var/lib/kubelet/pods/9ffc4749-ddbc-437e-9d25-23ec8f7ac0bc/volumes" Jan 30 21:37:39 crc 
kubenswrapper[4834]: I0130 21:37:39.586687 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e523b7ff-b98d-4428-b93d-5cd803cbf1a0","Type":"ContainerDied","Data":"13dfdc5763d3289a48f26fa06257d61794a05ff28fc6739dc2347ffd087b0e26"} Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.586729 4834 scope.go:117] "RemoveContainer" containerID="94969cc05f2e48f85144e89c2c40b825cca2e62fa2e6c9f6972d9eaea7409685" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.619592 4834 scope.go:117] "RemoveContainer" containerID="cea6222f01a6097a66d1bce8214b33c5e38a7c37fa6450c2b40f7c385ec16ac6" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.629479 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.646473 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.671414 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:37:39 crc kubenswrapper[4834]: E0130 21:37:39.672104 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerName="glance-log" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.672124 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerName="glance-log" Jan 30 21:37:39 crc kubenswrapper[4834]: E0130 21:37:39.672161 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerName="glance-httpd" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.672168 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerName="glance-httpd" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.672329 4834 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerName="glance-log" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.672350 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" containerName="glance-httpd" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.678433 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.684694 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.687905 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.696876 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.802737 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.802775 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.802826 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.802858 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.802877 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-config-data\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.802900 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-scripts\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.802951 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtjcf\" (UniqueName: \"kubernetes.io/projected/e39cea01-f258-49de-a89e-380cc2ccdbb1-kube-api-access-rtjcf\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.802996 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-logs\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.904293 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-logs\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.904384 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.904426 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.904630 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.904680 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.904709 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-config-data\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.904754 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-scripts\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.904830 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtjcf\" (UniqueName: \"kubernetes.io/projected/e39cea01-f258-49de-a89e-380cc2ccdbb1-kube-api-access-rtjcf\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.905206 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.908196 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.911600 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-logs\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.915677 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.918300 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.919513 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-config-data\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.924055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.926014 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtjcf\" (UniqueName: \"kubernetes.io/projected/e39cea01-f258-49de-a89e-380cc2ccdbb1-kube-api-access-rtjcf\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:39 crc kubenswrapper[4834]: I0130 21:37:39.953348 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " pod="openstack/glance-default-external-api-0" Jan 30 21:37:40 crc kubenswrapper[4834]: I0130 21:37:40.000179 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:37:40 crc kubenswrapper[4834]: I0130 21:37:40.560705 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4075b406-33bb-40e3-9429-087ba19fcb32","Type":"ContainerStarted","Data":"43decc1b912af4a1c561492b82cda3a337dd6d4896c7395e9fa11b341ae505d1"} Jan 30 21:37:40 crc kubenswrapper[4834]: I0130 21:37:40.961451 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:37:41 crc kubenswrapper[4834]: I0130 21:37:41.546085 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e523b7ff-b98d-4428-b93d-5cd803cbf1a0" path="/var/lib/kubelet/pods/e523b7ff-b98d-4428-b93d-5cd803cbf1a0/volumes" Jan 30 21:37:41 crc kubenswrapper[4834]: I0130 21:37:41.574261 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"4075b406-33bb-40e3-9429-087ba19fcb32","Type":"ContainerStarted","Data":"379f553bea4d38792a3b231c9d0af167a7bb371728d3a16701c761a78703280f"} Jan 30 21:37:41 crc kubenswrapper[4834]: I0130 21:37:41.576797 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e39cea01-f258-49de-a89e-380cc2ccdbb1","Type":"ContainerStarted","Data":"39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122"} Jan 30 21:37:41 crc kubenswrapper[4834]: I0130 21:37:41.576844 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e39cea01-f258-49de-a89e-380cc2ccdbb1","Type":"ContainerStarted","Data":"e023551f0f37caf7dd8dfce3bd6f3e40a5cef1c015da9916a56a5a26849cf884"} Jan 30 21:37:41 crc kubenswrapper[4834]: I0130 21:37:41.583000 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerStarted","Data":"18165d58c0adc67a62d21e1134476d8540bc6868b45199a58aed6c1660613161"} Jan 30 21:37:41 crc kubenswrapper[4834]: I0130 21:37:41.599993 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.599967008 podStartE2EDuration="3.599967008s" podCreationTimestamp="2026-01-30 21:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:41.595664737 +0000 UTC m=+1312.748810875" watchObservedRunningTime="2026-01-30 21:37:41.599967008 +0000 UTC m=+1312.753113166" Jan 30 21:37:42 crc kubenswrapper[4834]: I0130 21:37:42.614985 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e39cea01-f258-49de-a89e-380cc2ccdbb1","Type":"ContainerStarted","Data":"206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789"} Jan 30 21:37:42 crc 
kubenswrapper[4834]: I0130 21:37:42.616767 4834 generic.go:334] "Generic (PLEG): container finished" podID="11884add-09c1-45c0-92fc-9da5ebd51af3" containerID="54d881f2ae455fd50bee56d740d60633535ae77375bdadbfcff9928bbf3b2e7e" exitCode=0 Jan 30 21:37:42 crc kubenswrapper[4834]: I0130 21:37:42.617205 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8m5jq" event={"ID":"11884add-09c1-45c0-92fc-9da5ebd51af3","Type":"ContainerDied","Data":"54d881f2ae455fd50bee56d740d60633535ae77375bdadbfcff9928bbf3b2e7e"} Jan 30 21:37:43 crc kubenswrapper[4834]: I0130 21:37:43.650406 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.6503735840000004 podStartE2EDuration="4.650373584s" podCreationTimestamp="2026-01-30 21:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:43.649006155 +0000 UTC m=+1314.802152303" watchObservedRunningTime="2026-01-30 21:37:43.650373584 +0000 UTC m=+1314.803519732" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.423304 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.614006 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-fernet-keys\") pod \"11884add-09c1-45c0-92fc-9da5ebd51af3\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.614063 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-credential-keys\") pod \"11884add-09c1-45c0-92fc-9da5ebd51af3\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.614184 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-combined-ca-bundle\") pod \"11884add-09c1-45c0-92fc-9da5ebd51af3\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.614285 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-scripts\") pod \"11884add-09c1-45c0-92fc-9da5ebd51af3\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.614335 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-config-data\") pod \"11884add-09c1-45c0-92fc-9da5ebd51af3\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.614527 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mt2b\" (UniqueName: 
\"kubernetes.io/projected/11884add-09c1-45c0-92fc-9da5ebd51af3-kube-api-access-7mt2b\") pod \"11884add-09c1-45c0-92fc-9da5ebd51af3\" (UID: \"11884add-09c1-45c0-92fc-9da5ebd51af3\") " Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.622135 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "11884add-09c1-45c0-92fc-9da5ebd51af3" (UID: "11884add-09c1-45c0-92fc-9da5ebd51af3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.622169 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "11884add-09c1-45c0-92fc-9da5ebd51af3" (UID: "11884add-09c1-45c0-92fc-9da5ebd51af3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.622903 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11884add-09c1-45c0-92fc-9da5ebd51af3-kube-api-access-7mt2b" (OuterVolumeSpecName: "kube-api-access-7mt2b") pod "11884add-09c1-45c0-92fc-9da5ebd51af3" (UID: "11884add-09c1-45c0-92fc-9da5ebd51af3"). InnerVolumeSpecName "kube-api-access-7mt2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.636432 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-scripts" (OuterVolumeSpecName: "scripts") pod "11884add-09c1-45c0-92fc-9da5ebd51af3" (UID: "11884add-09c1-45c0-92fc-9da5ebd51af3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.653960 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8m5jq" event={"ID":"11884add-09c1-45c0-92fc-9da5ebd51af3","Type":"ContainerDied","Data":"54988d748eefe10ae7e45f613c068e8c2120bd7598b373400a4239a40a1707df"} Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.654028 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54988d748eefe10ae7e45f613c068e8c2120bd7598b373400a4239a40a1707df" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.654089 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8m5jq" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.662979 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11884add-09c1-45c0-92fc-9da5ebd51af3" (UID: "11884add-09c1-45c0-92fc-9da5ebd51af3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.672794 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-config-data" (OuterVolumeSpecName: "config-data") pod "11884add-09c1-45c0-92fc-9da5ebd51af3" (UID: "11884add-09c1-45c0-92fc-9da5ebd51af3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.717384 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.717459 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.717481 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.717498 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mt2b\" (UniqueName: \"kubernetes.io/projected/11884add-09c1-45c0-92fc-9da5ebd51af3-kube-api-access-7mt2b\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.717516 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.717531 4834 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/11884add-09c1-45c0-92fc-9da5ebd51af3-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:45 crc kubenswrapper[4834]: I0130 21:37:45.942071 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.643032 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5dffc79858-5v9vm"] Jan 30 21:37:46 crc 
kubenswrapper[4834]: E0130 21:37:46.645156 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11884add-09c1-45c0-92fc-9da5ebd51af3" containerName="keystone-bootstrap" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.645262 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="11884add-09c1-45c0-92fc-9da5ebd51af3" containerName="keystone-bootstrap" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.645642 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="11884add-09c1-45c0-92fc-9da5ebd51af3" containerName="keystone-bootstrap" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.646757 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.652300 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.652740 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.653010 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.653245 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-xzg4z" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.653989 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.654044 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.660272 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5dffc79858-5v9vm"] Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.689192 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerStarted","Data":"79f0b7c92b00dd743eb019b043fa64c9922636b656f0835b82b578859415fca7"} Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.842239 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-config-data\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.842328 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-scripts\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.842358 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6hs4\" (UniqueName: \"kubernetes.io/projected/c032c534-05a3-42f0-9d76-8b4c1b317a91-kube-api-access-s6hs4\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.842452 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-combined-ca-bundle\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.842486 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-internal-tls-certs\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.842506 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-credential-keys\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.842539 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-public-tls-certs\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.842581 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-fernet-keys\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.944279 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-scripts\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.944374 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6hs4\" (UniqueName: 
\"kubernetes.io/projected/c032c534-05a3-42f0-9d76-8b4c1b317a91-kube-api-access-s6hs4\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.944497 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-combined-ca-bundle\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.944527 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-internal-tls-certs\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.944560 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-credential-keys\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.944602 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-public-tls-certs\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.944687 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-fernet-keys\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.944860 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-config-data\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.951443 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-config-data\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.951703 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-internal-tls-certs\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.954136 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-public-tls-certs\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.954771 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-credential-keys\") pod \"keystone-5dffc79858-5v9vm\" (UID: 
\"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.955120 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-scripts\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.955123 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-combined-ca-bundle\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.955449 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c032c534-05a3-42f0-9d76-8b4c1b317a91-fernet-keys\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:46 crc kubenswrapper[4834]: I0130 21:37:46.973851 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6hs4\" (UniqueName: \"kubernetes.io/projected/c032c534-05a3-42f0-9d76-8b4c1b317a91-kube-api-access-s6hs4\") pod \"keystone-5dffc79858-5v9vm\" (UID: \"c032c534-05a3-42f0-9d76-8b4c1b317a91\") " pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:47 crc kubenswrapper[4834]: I0130 21:37:47.272240 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:47 crc kubenswrapper[4834]: I0130 21:37:47.823931 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5dffc79858-5v9vm"] Jan 30 21:37:47 crc kubenswrapper[4834]: W0130 21:37:47.878830 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc032c534_05a3_42f0_9d76_8b4c1b317a91.slice/crio-6f3ed06edd7352ee782865b8ab5dee4173de65ed6976c46eb319d2ffec4d44d1 WatchSource:0}: Error finding container 6f3ed06edd7352ee782865b8ab5dee4173de65ed6976c46eb319d2ffec4d44d1: Status 404 returned error can't find the container with id 6f3ed06edd7352ee782865b8ab5dee4173de65ed6976c46eb319d2ffec4d44d1 Jan 30 21:37:48 crc kubenswrapper[4834]: I0130 21:37:48.709235 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5dffc79858-5v9vm" event={"ID":"c032c534-05a3-42f0-9d76-8b4c1b317a91","Type":"ContainerStarted","Data":"5e9876122e33472864512297a236fbf056364c7da0170702074a9fa6696b1782"} Jan 30 21:37:48 crc kubenswrapper[4834]: I0130 21:37:48.709823 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:37:48 crc kubenswrapper[4834]: I0130 21:37:48.709839 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5dffc79858-5v9vm" event={"ID":"c032c534-05a3-42f0-9d76-8b4c1b317a91","Type":"ContainerStarted","Data":"6f3ed06edd7352ee782865b8ab5dee4173de65ed6976c46eb319d2ffec4d44d1"} Jan 30 21:37:48 crc kubenswrapper[4834]: I0130 21:37:48.712547 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5v96n" event={"ID":"814e680c-6380-4ab3-a481-2f9afe8b88ff","Type":"ContainerStarted","Data":"77608a94523b85eae42a84752cfa4a13d422e2d663f81e65e453670d6c8d72b5"} Jan 30 21:37:48 crc kubenswrapper[4834]: I0130 21:37:48.732044 4834 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/keystone-5dffc79858-5v9vm" podStartSLOduration=2.7320204820000002 podStartE2EDuration="2.732020482s" podCreationTimestamp="2026-01-30 21:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:48.724614774 +0000 UTC m=+1319.877760912" watchObservedRunningTime="2026-01-30 21:37:48.732020482 +0000 UTC m=+1319.885166620" Jan 30 21:37:48 crc kubenswrapper[4834]: I0130 21:37:48.745776 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-5v96n" podStartSLOduration=2.738783574 podStartE2EDuration="40.745750868s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="2026-01-30 21:37:10.038805191 +0000 UTC m=+1281.191951329" lastFinishedPulling="2026-01-30 21:37:48.045772485 +0000 UTC m=+1319.198918623" observedRunningTime="2026-01-30 21:37:48.743916566 +0000 UTC m=+1319.897062704" watchObservedRunningTime="2026-01-30 21:37:48.745750868 +0000 UTC m=+1319.898897016" Jan 30 21:37:48 crc kubenswrapper[4834]: I0130 21:37:48.978727 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:48 crc kubenswrapper[4834]: I0130 21:37:48.978792 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:49 crc kubenswrapper[4834]: I0130 21:37:49.026743 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:49 crc kubenswrapper[4834]: I0130 21:37:49.030772 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:49 crc kubenswrapper[4834]: I0130 21:37:49.721434 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 
21:37:49 crc kubenswrapper[4834]: I0130 21:37:49.721643 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:50 crc kubenswrapper[4834]: I0130 21:37:50.001491 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 21:37:50 crc kubenswrapper[4834]: I0130 21:37:50.001582 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 30 21:37:50 crc kubenswrapper[4834]: I0130 21:37:50.044120 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 21:37:50 crc kubenswrapper[4834]: I0130 21:37:50.061021 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 30 21:37:50 crc kubenswrapper[4834]: I0130 21:37:50.730952 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 21:37:50 crc kubenswrapper[4834]: I0130 21:37:50.732920 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 30 21:37:52 crc kubenswrapper[4834]: I0130 21:37:52.484107 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:52 crc kubenswrapper[4834]: I0130 21:37:52.484561 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:37:52 crc kubenswrapper[4834]: I0130 21:37:52.489787 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 21:37:52 crc kubenswrapper[4834]: I0130 21:37:52.667011 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 21:37:52 crc kubenswrapper[4834]: 
I0130 21:37:52.756836 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2ngf2" event={"ID":"25c62e65-34f5-4f5c-83fd-9af56b711bac","Type":"ContainerStarted","Data":"e3c7c0a01900b2a8ac0ea0a429298d2283cec4e9adc0bca59be91e837502e5b7"} Jan 30 21:37:52 crc kubenswrapper[4834]: I0130 21:37:52.756895 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:37:52 crc kubenswrapper[4834]: I0130 21:37:52.878366 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 21:37:53 crc kubenswrapper[4834]: I0130 21:37:53.790492 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2ngf2" podStartSLOduration=5.51220976 podStartE2EDuration="45.790472828s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="2026-01-30 21:37:09.856501347 +0000 UTC m=+1281.009647485" lastFinishedPulling="2026-01-30 21:37:50.134764415 +0000 UTC m=+1321.287910553" observedRunningTime="2026-01-30 21:37:53.780959381 +0000 UTC m=+1324.934105519" watchObservedRunningTime="2026-01-30 21:37:53.790472828 +0000 UTC m=+1324.943618986" Jan 30 21:37:54 crc kubenswrapper[4834]: I0130 21:37:54.773374 4834 generic.go:334] "Generic (PLEG): container finished" podID="814e680c-6380-4ab3-a481-2f9afe8b88ff" containerID="77608a94523b85eae42a84752cfa4a13d422e2d663f81e65e453670d6c8d72b5" exitCode=0 Jan 30 21:37:54 crc kubenswrapper[4834]: I0130 21:37:54.773467 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5v96n" event={"ID":"814e680c-6380-4ab3-a481-2f9afe8b88ff","Type":"ContainerDied","Data":"77608a94523b85eae42a84752cfa4a13d422e2d663f81e65e453670d6c8d72b5"} Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.785460 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ftcrs" 
event={"ID":"b49f42b3-c35a-4138-89e6-95f7abfa23bb","Type":"ContainerStarted","Data":"eacb214f7980e4aed57c5a903cfebb3c7d4b92d18752d9d10186e4d6df520eef"} Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.788509 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerStarted","Data":"d7889ff57d903c0342dff71d8f76d3cbfdc6adbddbd44dbdf88bdfca8c1dfd4b"} Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.788667 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="proxy-httpd" containerID="cri-o://d7889ff57d903c0342dff71d8f76d3cbfdc6adbddbd44dbdf88bdfca8c1dfd4b" gracePeriod=30 Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.788683 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="sg-core" containerID="cri-o://79f0b7c92b00dd743eb019b043fa64c9922636b656f0835b82b578859415fca7" gracePeriod=30 Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.788731 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="ceilometer-notification-agent" containerID="cri-o://18165d58c0adc67a62d21e1134476d8540bc6868b45199a58aed6c1660613161" gracePeriod=30 Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.788688 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.788722 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="ceilometer-central-agent" containerID="cri-o://8589392b4743297e8e6c586a063660bfbdedad8c1828ca48c68c4e86a25a0031" 
gracePeriod=30 Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.822670 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ftcrs" podStartSLOduration=2.44227925 podStartE2EDuration="47.822633442s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="2026-01-30 21:37:09.984138454 +0000 UTC m=+1281.137284592" lastFinishedPulling="2026-01-30 21:37:55.364492646 +0000 UTC m=+1326.517638784" observedRunningTime="2026-01-30 21:37:55.807931699 +0000 UTC m=+1326.961077847" watchObservedRunningTime="2026-01-30 21:37:55.822633442 +0000 UTC m=+1326.975779610" Jan 30 21:37:55 crc kubenswrapper[4834]: I0130 21:37:55.842149 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.502955015 podStartE2EDuration="47.84213018s" podCreationTimestamp="2026-01-30 21:37:08 +0000 UTC" firstStartedPulling="2026-01-30 21:37:10.035763515 +0000 UTC m=+1281.188909653" lastFinishedPulling="2026-01-30 21:37:55.37493868 +0000 UTC m=+1326.528084818" observedRunningTime="2026-01-30 21:37:55.829780653 +0000 UTC m=+1326.982926791" watchObservedRunningTime="2026-01-30 21:37:55.84213018 +0000 UTC m=+1326.995276328" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.166779 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5v96n" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.277286 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-combined-ca-bundle\") pod \"814e680c-6380-4ab3-a481-2f9afe8b88ff\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.277469 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-scripts\") pod \"814e680c-6380-4ab3-a481-2f9afe8b88ff\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.277521 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75vlr\" (UniqueName: \"kubernetes.io/projected/814e680c-6380-4ab3-a481-2f9afe8b88ff-kube-api-access-75vlr\") pod \"814e680c-6380-4ab3-a481-2f9afe8b88ff\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.277555 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e680c-6380-4ab3-a481-2f9afe8b88ff-logs\") pod \"814e680c-6380-4ab3-a481-2f9afe8b88ff\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.277576 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-config-data\") pod \"814e680c-6380-4ab3-a481-2f9afe8b88ff\" (UID: \"814e680c-6380-4ab3-a481-2f9afe8b88ff\") " Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.278785 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/814e680c-6380-4ab3-a481-2f9afe8b88ff-logs" (OuterVolumeSpecName: "logs") pod "814e680c-6380-4ab3-a481-2f9afe8b88ff" (UID: "814e680c-6380-4ab3-a481-2f9afe8b88ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.283851 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-scripts" (OuterVolumeSpecName: "scripts") pod "814e680c-6380-4ab3-a481-2f9afe8b88ff" (UID: "814e680c-6380-4ab3-a481-2f9afe8b88ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.283873 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814e680c-6380-4ab3-a481-2f9afe8b88ff-kube-api-access-75vlr" (OuterVolumeSpecName: "kube-api-access-75vlr") pod "814e680c-6380-4ab3-a481-2f9afe8b88ff" (UID: "814e680c-6380-4ab3-a481-2f9afe8b88ff"). InnerVolumeSpecName "kube-api-access-75vlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.307630 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "814e680c-6380-4ab3-a481-2f9afe8b88ff" (UID: "814e680c-6380-4ab3-a481-2f9afe8b88ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.310878 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-config-data" (OuterVolumeSpecName: "config-data") pod "814e680c-6380-4ab3-a481-2f9afe8b88ff" (UID: "814e680c-6380-4ab3-a481-2f9afe8b88ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.379425 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.379477 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.379491 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75vlr\" (UniqueName: \"kubernetes.io/projected/814e680c-6380-4ab3-a481-2f9afe8b88ff-kube-api-access-75vlr\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.379507 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e680c-6380-4ab3-a481-2f9afe8b88ff-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.379519 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e680c-6380-4ab3-a481-2f9afe8b88ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.807098 4834 generic.go:334] "Generic (PLEG): container finished" podID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerID="d7889ff57d903c0342dff71d8f76d3cbfdc6adbddbd44dbdf88bdfca8c1dfd4b" exitCode=0 Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.807151 4834 generic.go:334] "Generic (PLEG): container finished" podID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerID="79f0b7c92b00dd743eb019b043fa64c9922636b656f0835b82b578859415fca7" exitCode=2 Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.807174 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerID="8589392b4743297e8e6c586a063660bfbdedad8c1828ca48c68c4e86a25a0031" exitCode=0 Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.807219 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerDied","Data":"d7889ff57d903c0342dff71d8f76d3cbfdc6adbddbd44dbdf88bdfca8c1dfd4b"} Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.807295 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerDied","Data":"79f0b7c92b00dd743eb019b043fa64c9922636b656f0835b82b578859415fca7"} Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.807321 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerDied","Data":"8589392b4743297e8e6c586a063660bfbdedad8c1828ca48c68c4e86a25a0031"} Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.810210 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5v96n" event={"ID":"814e680c-6380-4ab3-a481-2f9afe8b88ff","Type":"ContainerDied","Data":"4ed7656294fd66a00ca2ff282955ec0b8054c2efa92691aeeb8554a873a9c6e5"} Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.810312 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ed7656294fd66a00ca2ff282955ec0b8054c2efa92691aeeb8554a873a9c6e5" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.810309 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5v96n" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.977294 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-65cd556484-hkshq"] Jan 30 21:37:56 crc kubenswrapper[4834]: E0130 21:37:56.977856 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814e680c-6380-4ab3-a481-2f9afe8b88ff" containerName="placement-db-sync" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.977886 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="814e680c-6380-4ab3-a481-2f9afe8b88ff" containerName="placement-db-sync" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.978230 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="814e680c-6380-4ab3-a481-2f9afe8b88ff" containerName="placement-db-sync" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.979868 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.983453 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.983826 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zqf4b" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.984152 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.984560 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.984865 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 30 21:37:56 crc kubenswrapper[4834]: I0130 21:37:56.997311 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-65cd556484-hkshq"] Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.095166 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-internal-tls-certs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.095332 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-scripts\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.095643 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-public-tls-certs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.095828 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316f606b-e690-43aa-afb0-d8643180d92a-logs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.095900 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-combined-ca-bundle\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " 
pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.096115 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-config-data\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.096209 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46shj\" (UniqueName: \"kubernetes.io/projected/316f606b-e690-43aa-afb0-d8643180d92a-kube-api-access-46shj\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.197720 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316f606b-e690-43aa-afb0-d8643180d92a-logs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.197782 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-combined-ca-bundle\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.197854 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-config-data\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 
21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.197891 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46shj\" (UniqueName: \"kubernetes.io/projected/316f606b-e690-43aa-afb0-d8643180d92a-kube-api-access-46shj\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.197949 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-internal-tls-certs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.197997 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-scripts\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.198063 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-public-tls-certs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.198328 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/316f606b-e690-43aa-afb0-d8643180d92a-logs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.201768 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-public-tls-certs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.202511 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-combined-ca-bundle\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.201850 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-internal-tls-certs\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.202564 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-scripts\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.202744 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/316f606b-e690-43aa-afb0-d8643180d92a-config-data\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.218027 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46shj\" (UniqueName: 
\"kubernetes.io/projected/316f606b-e690-43aa-afb0-d8643180d92a-kube-api-access-46shj\") pod \"placement-65cd556484-hkshq\" (UID: \"316f606b-e690-43aa-afb0-d8643180d92a\") " pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.312565 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.757263 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65cd556484-hkshq"] Jan 30 21:37:57 crc kubenswrapper[4834]: W0130 21:37:57.766026 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod316f606b_e690_43aa_afb0_d8643180d92a.slice/crio-c970942ac0e0802568c76d148641bc658dd2edb07d1f419677986a09bf1e6612 WatchSource:0}: Error finding container c970942ac0e0802568c76d148641bc658dd2edb07d1f419677986a09bf1e6612: Status 404 returned error can't find the container with id c970942ac0e0802568c76d148641bc658dd2edb07d1f419677986a09bf1e6612 Jan 30 21:37:57 crc kubenswrapper[4834]: I0130 21:37:57.821626 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65cd556484-hkshq" event={"ID":"316f606b-e690-43aa-afb0-d8643180d92a","Type":"ContainerStarted","Data":"c970942ac0e0802568c76d148641bc658dd2edb07d1f419677986a09bf1e6612"} Jan 30 21:37:58 crc kubenswrapper[4834]: I0130 21:37:58.835914 4834 generic.go:334] "Generic (PLEG): container finished" podID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerID="18165d58c0adc67a62d21e1134476d8540bc6868b45199a58aed6c1660613161" exitCode=0 Jan 30 21:37:58 crc kubenswrapper[4834]: I0130 21:37:58.835992 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerDied","Data":"18165d58c0adc67a62d21e1134476d8540bc6868b45199a58aed6c1660613161"} Jan 30 21:37:58 crc 
kubenswrapper[4834]: I0130 21:37:58.839018 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65cd556484-hkshq" event={"ID":"316f606b-e690-43aa-afb0-d8643180d92a","Type":"ContainerStarted","Data":"f5c31e3cb1e8d31f958d49d42a11a7651d7d39a13dad1df258b7ba1f0cc8f28f"} Jan 30 21:37:58 crc kubenswrapper[4834]: I0130 21:37:58.839091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65cd556484-hkshq" event={"ID":"316f606b-e690-43aa-afb0-d8643180d92a","Type":"ContainerStarted","Data":"fe935c57a4a000db758205f6989f81a51d4cb5eedeed432a399ee4b42d5ae90a"} Jan 30 21:37:58 crc kubenswrapper[4834]: I0130 21:37:58.839274 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:58 crc kubenswrapper[4834]: I0130 21:37:58.839445 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65cd556484-hkshq" Jan 30 21:37:58 crc kubenswrapper[4834]: I0130 21:37:58.876831 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-65cd556484-hkshq" podStartSLOduration=2.8768032679999997 podStartE2EDuration="2.876803268s" podCreationTimestamp="2026-01-30 21:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:37:58.866124058 +0000 UTC m=+1330.019270196" watchObservedRunningTime="2026-01-30 21:37:58.876803268 +0000 UTC m=+1330.029949446" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.314909 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.344502 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-scripts\") pod \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.346063 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-run-httpd\") pod \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.346175 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-sg-core-conf-yaml\") pod \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.346230 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-log-httpd\") pod \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.346308 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djrmb\" (UniqueName: \"kubernetes.io/projected/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-kube-api-access-djrmb\") pod \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.346525 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-config-data\") pod \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.346881 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" (UID: "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.346891 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" (UID: "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.347093 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-combined-ca-bundle\") pod \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\" (UID: \"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813\") " Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.347883 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.347914 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.352354 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-scripts" (OuterVolumeSpecName: "scripts") pod "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" (UID: "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.354426 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-kube-api-access-djrmb" (OuterVolumeSpecName: "kube-api-access-djrmb") pod "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" (UID: "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813"). InnerVolumeSpecName "kube-api-access-djrmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.387020 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" (UID: "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.449526 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.449558 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djrmb\" (UniqueName: \"kubernetes.io/projected/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-kube-api-access-djrmb\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.449570 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.457911 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" (UID: "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.475566 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-config-data" (OuterVolumeSpecName: "config-data") pod "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" (UID: "bc9c2a7c-f743-4351-b5b0-81ce8ef6b813"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.552011 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.552064 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.856147 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc9c2a7c-f743-4351-b5b0-81ce8ef6b813","Type":"ContainerDied","Data":"ae9063fcf9e1456c6400a6874fad5c401ad0139fbe25df31cd4fd667c0124225"} Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.856191 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.856232 4834 scope.go:117] "RemoveContainer" containerID="d7889ff57d903c0342dff71d8f76d3cbfdc6adbddbd44dbdf88bdfca8c1dfd4b" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.890014 4834 scope.go:117] "RemoveContainer" containerID="79f0b7c92b00dd743eb019b043fa64c9922636b656f0835b82b578859415fca7" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.905012 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.924534 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.931851 4834 scope.go:117] "RemoveContainer" containerID="18165d58c0adc67a62d21e1134476d8540bc6868b45199a58aed6c1660613161" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935098 4834 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Jan 30 21:37:59 crc kubenswrapper[4834]: E0130 21:37:59.935539 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="proxy-httpd" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935559 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="proxy-httpd" Jan 30 21:37:59 crc kubenswrapper[4834]: E0130 21:37:59.935573 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="ceilometer-central-agent" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935580 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="ceilometer-central-agent" Jan 30 21:37:59 crc kubenswrapper[4834]: E0130 21:37:59.935613 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="sg-core" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935620 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="sg-core" Jan 30 21:37:59 crc kubenswrapper[4834]: E0130 21:37:59.935665 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="ceilometer-notification-agent" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935676 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="ceilometer-notification-agent" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935903 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="proxy-httpd" Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935922 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="ceilometer-central-agent"
Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935938 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="sg-core"
Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.935954 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" containerName="ceilometer-notification-agent"
Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.937989 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.941802 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.941981 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.956032 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:37:59 crc kubenswrapper[4834]: I0130 21:37:59.976778 4834 scope.go:117] "RemoveContainer" containerID="8589392b4743297e8e6c586a063660bfbdedad8c1828ca48c68c4e86a25a0031"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.032506 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:00 crc kubenswrapper[4834]: E0130 21:38:00.033570 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-s7mmv log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[combined-ca-bundle config-data kube-api-access-s7mmv log-httpd run-httpd sg-core-conf-yaml]: context canceled" pod="openstack/ceilometer-0" podUID="47879756-2a1f-4bc3-8e7c-9ab4563b9605"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.061797 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.061874 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-config-data\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.061909 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.061931 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-log-httpd\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.062005 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-scripts\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.062067 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-run-httpd\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.062098 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mmv\" (UniqueName: \"kubernetes.io/projected/47879756-2a1f-4bc3-8e7c-9ab4563b9605-kube-api-access-s7mmv\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.164024 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-run-httpd\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.164538 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7mmv\" (UniqueName: \"kubernetes.io/projected/47879756-2a1f-4bc3-8e7c-9ab4563b9605-kube-api-access-s7mmv\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.164593 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.164689 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-config-data\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.164738 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.164784 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-log-httpd\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.164856 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-scripts\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.165072 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-run-httpd\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.166006 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-log-httpd\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.169855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.170328 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.170579 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-scripts\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.172081 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-config-data\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.191869 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7mmv\" (UniqueName: \"kubernetes.io/projected/47879756-2a1f-4bc3-8e7c-9ab4563b9605-kube-api-access-s7mmv\") pod \"ceilometer-0\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") " pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.868606 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.887961 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.979185 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-combined-ca-bundle\") pod \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") "
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.979454 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-scripts\") pod \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") "
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.979514 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7mmv\" (UniqueName: \"kubernetes.io/projected/47879756-2a1f-4bc3-8e7c-9ab4563b9605-kube-api-access-s7mmv\") pod \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") "
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.979577 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-run-httpd\") pod \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") "
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.979764 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-log-httpd\") pod \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") "
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.979854 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-config-data\") pod \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") "
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.979912 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-sg-core-conf-yaml\") pod \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\" (UID: \"47879756-2a1f-4bc3-8e7c-9ab4563b9605\") "
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.981154 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "47879756-2a1f-4bc3-8e7c-9ab4563b9605" (UID: "47879756-2a1f-4bc3-8e7c-9ab4563b9605"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.981764 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "47879756-2a1f-4bc3-8e7c-9ab4563b9605" (UID: "47879756-2a1f-4bc3-8e7c-9ab4563b9605"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.988715 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "47879756-2a1f-4bc3-8e7c-9ab4563b9605" (UID: "47879756-2a1f-4bc3-8e7c-9ab4563b9605"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.989523 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47879756-2a1f-4bc3-8e7c-9ab4563b9605-kube-api-access-s7mmv" (OuterVolumeSpecName: "kube-api-access-s7mmv") pod "47879756-2a1f-4bc3-8e7c-9ab4563b9605" (UID: "47879756-2a1f-4bc3-8e7c-9ab4563b9605"). InnerVolumeSpecName "kube-api-access-s7mmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.989568 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-scripts" (OuterVolumeSpecName: "scripts") pod "47879756-2a1f-4bc3-8e7c-9ab4563b9605" (UID: "47879756-2a1f-4bc3-8e7c-9ab4563b9605"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.992621 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47879756-2a1f-4bc3-8e7c-9ab4563b9605" (UID: "47879756-2a1f-4bc3-8e7c-9ab4563b9605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:00 crc kubenswrapper[4834]: I0130 21:38:00.996513 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-config-data" (OuterVolumeSpecName: "config-data") pod "47879756-2a1f-4bc3-8e7c-9ab4563b9605" (UID: "47879756-2a1f-4bc3-8e7c-9ab4563b9605"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.082781 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.083238 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.083268 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7mmv\" (UniqueName: \"kubernetes.io/projected/47879756-2a1f-4bc3-8e7c-9ab4563b9605-kube-api-access-s7mmv\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.083286 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.083298 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47879756-2a1f-4bc3-8e7c-9ab4563b9605-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.083311 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.083325 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47879756-2a1f-4bc3-8e7c-9ab4563b9605-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.543791 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9c2a7c-f743-4351-b5b0-81ce8ef6b813" path="/var/lib/kubelet/pods/bc9c2a7c-f743-4351-b5b0-81ce8ef6b813/volumes"
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.877014 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.954238 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.965979 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:01 crc kubenswrapper[4834]: I0130 21:38:01.996618 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.000759 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.004073 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.004228 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.008915 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.105470 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-log-httpd\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.105919 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.106093 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2hcn\" (UniqueName: \"kubernetes.io/projected/d5b5c426-8ce0-4431-8f98-a18bc07163ff-kube-api-access-w2hcn\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.106223 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-config-data\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.106346 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-scripts\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.106660 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.106816 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-run-httpd\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.208833 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-log-httpd\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.208914 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.208950 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2hcn\" (UniqueName: \"kubernetes.io/projected/d5b5c426-8ce0-4431-8f98-a18bc07163ff-kube-api-access-w2hcn\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.208968 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-config-data\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.208988 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-scripts\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.209018 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.209042 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-run-httpd\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.209574 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-run-httpd\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.209669 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-log-httpd\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.214928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-scripts\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.215274 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.215051 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-config-data\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.217343 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.238973 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2hcn\" (UniqueName: \"kubernetes.io/projected/d5b5c426-8ce0-4431-8f98-a18bc07163ff-kube-api-access-w2hcn\") pod \"ceilometer-0\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.326780 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.809083 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:02 crc kubenswrapper[4834]: W0130 21:38:02.824269 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5b5c426_8ce0_4431_8f98_a18bc07163ff.slice/crio-625489fb6106ae3063054779db61642f62845e171c658160c9ace9b6a61731e2 WatchSource:0}: Error finding container 625489fb6106ae3063054779db61642f62845e171c658160c9ace9b6a61731e2: Status 404 returned error can't find the container with id 625489fb6106ae3063054779db61642f62845e171c658160c9ace9b6a61731e2
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.891694 4834 generic.go:334] "Generic (PLEG): container finished" podID="b49f42b3-c35a-4138-89e6-95f7abfa23bb" containerID="eacb214f7980e4aed57c5a903cfebb3c7d4b92d18752d9d10186e4d6df520eef" exitCode=0
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.891785 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ftcrs" event={"ID":"b49f42b3-c35a-4138-89e6-95f7abfa23bb","Type":"ContainerDied","Data":"eacb214f7980e4aed57c5a903cfebb3c7d4b92d18752d9d10186e4d6df520eef"}
Jan 30 21:38:02 crc kubenswrapper[4834]: I0130 21:38:02.907780 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerStarted","Data":"625489fb6106ae3063054779db61642f62845e171c658160c9ace9b6a61731e2"}
Jan 30 21:38:03 crc kubenswrapper[4834]: I0130 21:38:03.545631 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47879756-2a1f-4bc3-8e7c-9ab4563b9605" path="/var/lib/kubelet/pods/47879756-2a1f-4bc3-8e7c-9ab4563b9605/volumes"
Jan 30 21:38:03 crc kubenswrapper[4834]: I0130 21:38:03.918558 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerStarted","Data":"b5253a399f45f16b3687052f9a2ec21dfa1af7be25a435f31860dcc04d837d78"}
Jan 30 21:38:03 crc kubenswrapper[4834]: I0130 21:38:03.920930 4834 generic.go:334] "Generic (PLEG): container finished" podID="bb10ef46-028a-4e54-a587-4843dae377f9" containerID="3e7ec2eafede4e5b75a15bc96c461c617bfe6bc2a6ddad530a1360ad09a8f347" exitCode=0
Jan 30 21:38:03 crc kubenswrapper[4834]: I0130 21:38:03.921019 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gvxl7" event={"ID":"bb10ef46-028a-4e54-a587-4843dae377f9","Type":"ContainerDied","Data":"3e7ec2eafede4e5b75a15bc96c461c617bfe6bc2a6ddad530a1360ad09a8f347"}
Jan 30 21:38:03 crc kubenswrapper[4834]: I0130 21:38:03.923092 4834 generic.go:334] "Generic (PLEG): container finished" podID="25c62e65-34f5-4f5c-83fd-9af56b711bac" containerID="e3c7c0a01900b2a8ac0ea0a429298d2283cec4e9adc0bca59be91e837502e5b7" exitCode=0
Jan 30 21:38:03 crc kubenswrapper[4834]: I0130 21:38:03.923150 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2ngf2" event={"ID":"25c62e65-34f5-4f5c-83fd-9af56b711bac","Type":"ContainerDied","Data":"e3c7c0a01900b2a8ac0ea0a429298d2283cec4e9adc0bca59be91e837502e5b7"}
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.160444 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.160831 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.232193 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.362341 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-combined-ca-bundle\") pod \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") "
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.362446 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctnhx\" (UniqueName: \"kubernetes.io/projected/b49f42b3-c35a-4138-89e6-95f7abfa23bb-kube-api-access-ctnhx\") pod \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") "
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.362501 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-db-sync-config-data\") pod \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\" (UID: \"b49f42b3-c35a-4138-89e6-95f7abfa23bb\") "
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.368198 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49f42b3-c35a-4138-89e6-95f7abfa23bb-kube-api-access-ctnhx" (OuterVolumeSpecName: "kube-api-access-ctnhx") pod "b49f42b3-c35a-4138-89e6-95f7abfa23bb" (UID: "b49f42b3-c35a-4138-89e6-95f7abfa23bb"). InnerVolumeSpecName "kube-api-access-ctnhx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.369054 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b49f42b3-c35a-4138-89e6-95f7abfa23bb" (UID: "b49f42b3-c35a-4138-89e6-95f7abfa23bb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.390419 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b49f42b3-c35a-4138-89e6-95f7abfa23bb" (UID: "b49f42b3-c35a-4138-89e6-95f7abfa23bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.467216 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.467495 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctnhx\" (UniqueName: \"kubernetes.io/projected/b49f42b3-c35a-4138-89e6-95f7abfa23bb-kube-api-access-ctnhx\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.467613 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b49f42b3-c35a-4138-89e6-95f7abfa23bb-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.938625 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerStarted","Data":"b3d780c50a321a5a6f2fcd20f56ec4c15b04a7bb0d4028c54a47657da158a48d"}
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.945029 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ftcrs" event={"ID":"b49f42b3-c35a-4138-89e6-95f7abfa23bb","Type":"ContainerDied","Data":"71659e5bda75e5aa6bfe9b4af640f68056e8af7ad8a3a77482d2e50e93c88217"}
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.945077 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71659e5bda75e5aa6bfe9b4af640f68056e8af7ad8a3a77482d2e50e93c88217"
Jan 30 21:38:04 crc kubenswrapper[4834]: I0130 21:38:04.945454 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ftcrs"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.243445 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-fffb48c8c-724zs"]
Jan 30 21:38:05 crc kubenswrapper[4834]: E0130 21:38:05.244093 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49f42b3-c35a-4138-89e6-95f7abfa23bb" containerName="barbican-db-sync"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.244109 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49f42b3-c35a-4138-89e6-95f7abfa23bb" containerName="barbican-db-sync"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.244338 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49f42b3-c35a-4138-89e6-95f7abfa23bb" containerName="barbican-db-sync"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.245236 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-fffb48c8c-724zs"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.247606 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fnmp7"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.247803 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.248813 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.249444 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2ngf2"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.259976 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-fffb48c8c-724zs"]
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282403 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c62e65-34f5-4f5c-83fd-9af56b711bac-etc-machine-id\") pod \"25c62e65-34f5-4f5c-83fd-9af56b711bac\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") "
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282450 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-combined-ca-bundle\") pod \"25c62e65-34f5-4f5c-83fd-9af56b711bac\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") "
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282498 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-db-sync-config-data\") pod \"25c62e65-34f5-4f5c-83fd-9af56b711bac\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") "
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282529 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9knn\" (UniqueName: \"kubernetes.io/projected/25c62e65-34f5-4f5c-83fd-9af56b711bac-kube-api-access-l9knn\") pod \"25c62e65-34f5-4f5c-83fd-9af56b711bac\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") "
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282565 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-config-data\") pod \"25c62e65-34f5-4f5c-83fd-9af56b711bac\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") "
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282636 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-scripts\") pod \"25c62e65-34f5-4f5c-83fd-9af56b711bac\" (UID: \"25c62e65-34f5-4f5c-83fd-9af56b711bac\") "
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282919 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e44120e0-1332-4341-8638-917c3f2f1760-logs\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282958 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-config-data-custom\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.282997 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4dgl\" (UniqueName: \"kubernetes.io/projected/e44120e0-1332-4341-8638-917c3f2f1760-kube-api-access-q4dgl\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.283061 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-config-data\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.283090 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-combined-ca-bundle\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs"
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.285005 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25c62e65-34f5-4f5c-83fd-9af56b711bac-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "25c62e65-34f5-4f5c-83fd-9af56b711bac" (UID: "25c62e65-34f5-4f5c-83fd-9af56b711bac"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.290189 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-scripts" (OuterVolumeSpecName: "scripts") pod "25c62e65-34f5-4f5c-83fd-9af56b711bac" (UID: "25c62e65-34f5-4f5c-83fd-9af56b711bac"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.294712 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6dd649c8b-p8wq8"] Jan 30 21:38:05 crc kubenswrapper[4834]: E0130 21:38:05.295289 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c62e65-34f5-4f5c-83fd-9af56b711bac" containerName="cinder-db-sync" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.295305 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c62e65-34f5-4f5c-83fd-9af56b711bac" containerName="cinder-db-sync" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.295551 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c62e65-34f5-4f5c-83fd-9af56b711bac" containerName="cinder-db-sync" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.298032 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.300477 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.302808 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "25c62e65-34f5-4f5c-83fd-9af56b711bac" (UID: "25c62e65-34f5-4f5c-83fd-9af56b711bac"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.333781 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c62e65-34f5-4f5c-83fd-9af56b711bac-kube-api-access-l9knn" (OuterVolumeSpecName: "kube-api-access-l9knn") pod "25c62e65-34f5-4f5c-83fd-9af56b711bac" (UID: "25c62e65-34f5-4f5c-83fd-9af56b711bac"). InnerVolumeSpecName "kube-api-access-l9knn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.385999 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-config-data\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.387905 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-combined-ca-bundle\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.387967 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp825\" (UniqueName: \"kubernetes.io/projected/a7d82a62-7407-4012-9f92-8b0abe1afa08-kube-api-access-rp825\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388012 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-combined-ca-bundle\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388046 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-config-data\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388086 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e44120e0-1332-4341-8638-917c3f2f1760-logs\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388118 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-config-data-custom\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388140 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d82a62-7407-4012-9f92-8b0abe1afa08-logs\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388174 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-config-data-custom\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388203 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4dgl\" (UniqueName: \"kubernetes.io/projected/e44120e0-1332-4341-8638-917c3f2f1760-kube-api-access-q4dgl\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388319 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25c62e65-34f5-4f5c-83fd-9af56b711bac-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388331 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388341 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9knn\" (UniqueName: \"kubernetes.io/projected/25c62e65-34f5-4f5c-83fd-9af56b711bac-kube-api-access-l9knn\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.388358 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.390178 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e44120e0-1332-4341-8638-917c3f2f1760-logs\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.404900 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-config-data-custom\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.424776 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-config-data\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.429705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e44120e0-1332-4341-8638-917c3f2f1760-combined-ca-bundle\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.436822 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4dgl\" (UniqueName: \"kubernetes.io/projected/e44120e0-1332-4341-8638-917c3f2f1760-kube-api-access-q4dgl\") pod \"barbican-worker-fffb48c8c-724zs\" (UID: \"e44120e0-1332-4341-8638-917c3f2f1760\") " pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.463523 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6dd649c8b-p8wq8"] Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 
21:38:05.491785 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp825\" (UniqueName: \"kubernetes.io/projected/a7d82a62-7407-4012-9f92-8b0abe1afa08-kube-api-access-rp825\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.491861 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-combined-ca-bundle\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.491904 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-config-data\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.491968 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d82a62-7407-4012-9f92-8b0abe1afa08-logs\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.492016 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-config-data-custom\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " 
pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.494991 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7d82a62-7407-4012-9f92-8b0abe1afa08-logs\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.495080 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25c62e65-34f5-4f5c-83fd-9af56b711bac" (UID: "25c62e65-34f5-4f5c-83fd-9af56b711bac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.497176 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-config-data-custom\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.500507 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-config-data\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.511497 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp825\" (UniqueName: \"kubernetes.io/projected/a7d82a62-7407-4012-9f92-8b0abe1afa08-kube-api-access-rp825\") pod 
\"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.511497 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d82a62-7407-4012-9f92-8b0abe1afa08-combined-ca-bundle\") pod \"barbican-keystone-listener-6dd649c8b-p8wq8\" (UID: \"a7d82a62-7407-4012-9f92-8b0abe1afa08\") " pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.532512 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-qsvxj"] Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.536646 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.554660 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-config-data" (OuterVolumeSpecName: "config-data") pod "25c62e65-34f5-4f5c-83fd-9af56b711bac" (UID: "25c62e65-34f5-4f5c-83fd-9af56b711bac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.582379 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-fffb48c8c-724zs" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.594164 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-qsvxj"] Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.594192 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76bf79b888-r57rl"] Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.595681 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76bf79b888-r57rl"] Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.595755 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.601074 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.601405 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.601468 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-config\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc 
kubenswrapper[4834]: I0130 21:38:05.601550 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.601696 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s48c\" (UniqueName: \"kubernetes.io/projected/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-kube-api-access-8s48c\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.602096 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.602695 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.602904 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.602925 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c62e65-34f5-4f5c-83fd-9af56b711bac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.607935 4834 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.631557 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.703873 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-config\") pod \"bb10ef46-028a-4e54-a587-4843dae377f9\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.703998 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xc49\" (UniqueName: \"kubernetes.io/projected/bb10ef46-028a-4e54-a587-4843dae377f9-kube-api-access-8xc49\") pod \"bb10ef46-028a-4e54-a587-4843dae377f9\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.704066 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-combined-ca-bundle\") pod \"bb10ef46-028a-4e54-a587-4843dae377f9\" (UID: \"bb10ef46-028a-4e54-a587-4843dae377f9\") " Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.704321 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42b9899-e1fe-422f-af90-b08348a78694-logs\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.704346 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.704363 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data-custom\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.704387 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.705386 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-config\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.705428 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.705479 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s48c\" (UniqueName: 
\"kubernetes.io/projected/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-kube-api-access-8s48c\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.705506 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.705535 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.705562 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-combined-ca-bundle\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.705583 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssdg\" (UniqueName: \"kubernetes.io/projected/e42b9899-e1fe-422f-af90-b08348a78694-kube-api-access-kssdg\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.706034 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-sb\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.708180 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-svc\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.708723 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-config\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.712665 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb10ef46-028a-4e54-a587-4843dae377f9-kube-api-access-8xc49" (OuterVolumeSpecName: "kube-api-access-8xc49") pod "bb10ef46-028a-4e54-a587-4843dae377f9" (UID: "bb10ef46-028a-4e54-a587-4843dae377f9"). InnerVolumeSpecName "kube-api-access-8xc49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.714197 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-nb\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.716246 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-swift-storage-0\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.726097 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s48c\" (UniqueName: \"kubernetes.io/projected/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-kube-api-access-8s48c\") pod \"dnsmasq-dns-6d66f584d7-qsvxj\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.748525 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-config" (OuterVolumeSpecName: "config") pod "bb10ef46-028a-4e54-a587-4843dae377f9" (UID: "bb10ef46-028a-4e54-a587-4843dae377f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.748582 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb10ef46-028a-4e54-a587-4843dae377f9" (UID: "bb10ef46-028a-4e54-a587-4843dae377f9"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.807034 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-combined-ca-bundle\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.807088 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kssdg\" (UniqueName: \"kubernetes.io/projected/e42b9899-e1fe-422f-af90-b08348a78694-kube-api-access-kssdg\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.807203 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42b9899-e1fe-422f-af90-b08348a78694-logs\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.807227 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.807249 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data-custom\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " 
pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.807375 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.807402 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb10ef46-028a-4e54-a587-4843dae377f9-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.807428 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xc49\" (UniqueName: \"kubernetes.io/projected/bb10ef46-028a-4e54-a587-4843dae377f9-kube-api-access-8xc49\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.808529 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42b9899-e1fe-422f-af90-b08348a78694-logs\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.811802 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data-custom\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.813648 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-combined-ca-bundle\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 
21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.816090 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.825500 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssdg\" (UniqueName: \"kubernetes.io/projected/e42b9899-e1fe-422f-af90-b08348a78694-kube-api-access-kssdg\") pod \"barbican-api-76bf79b888-r57rl\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.936764 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.949047 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.957898 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2ngf2" event={"ID":"25c62e65-34f5-4f5c-83fd-9af56b711bac","Type":"ContainerDied","Data":"d0ffcec7bd3450452b74c854c93b31675624d5b17785819d17b823e8f4a7214c"} Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.957940 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0ffcec7bd3450452b74c854c93b31675624d5b17785819d17b823e8f4a7214c" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.958010 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2ngf2" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.965459 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerStarted","Data":"1e37564284014c67f25364cd245b64f6f167b667a5d1d6abb7b789f79149144d"} Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.970308 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-gvxl7" event={"ID":"bb10ef46-028a-4e54-a587-4843dae377f9","Type":"ContainerDied","Data":"3611d6a458799e806d5c03d2eff95ca2f023a02330403bf3278b030797f114a6"} Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.970344 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3611d6a458799e806d5c03d2eff95ca2f023a02330403bf3278b030797f114a6" Jan 30 21:38:05 crc kubenswrapper[4834]: I0130 21:38:05.970414 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-gvxl7" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.077523 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-fffb48c8c-724zs"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.100463 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6dd649c8b-p8wq8"] Jan 30 21:38:06 crc kubenswrapper[4834]: W0130 21:38:06.102617 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d82a62_7407_4012_9f92_8b0abe1afa08.slice/crio-3f9365ebc65a6a24c6cd6f33c4ade5389a5a6af746c52c20c381c91b1636bf1c WatchSource:0}: Error finding container 3f9365ebc65a6a24c6cd6f33c4ade5389a5a6af746c52c20c381c91b1636bf1c: Status 404 returned error can't find the container with id 3f9365ebc65a6a24c6cd6f33c4ade5389a5a6af746c52c20c381c91b1636bf1c Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.305332 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-qsvxj"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.369872 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-pw9sj"] Jan 30 21:38:06 crc kubenswrapper[4834]: E0130 21:38:06.370244 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb10ef46-028a-4e54-a587-4843dae377f9" containerName="neutron-db-sync" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.370263 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb10ef46-028a-4e54-a587-4843dae377f9" containerName="neutron-db-sync" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.370457 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb10ef46-028a-4e54-a587-4843dae377f9" containerName="neutron-db-sync" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.376542 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.425094 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-svc\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.425144 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fg4\" (UniqueName: \"kubernetes.io/projected/a75259f2-d600-45f5-b048-12af33294954-kube-api-access-v5fg4\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.425220 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-config\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.425324 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.425381 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" 
(UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.425454 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.426468 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-pw9sj"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.483338 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6688b77c56-lfnbd"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.484823 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.495362 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.495552 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-9tk2h" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.495652 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.496216 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.528767 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: 
\"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.528827 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.528848 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-config\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.528874 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-ovndb-tls-certs\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.528936 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-svc\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.528960 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fg4\" (UniqueName: \"kubernetes.io/projected/a75259f2-d600-45f5-b048-12af33294954-kube-api-access-v5fg4\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") 
" pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.528976 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-424j5\" (UniqueName: \"kubernetes.io/projected/0067b767-3e5f-41d1-ba56-d64799b81a8c-kube-api-access-424j5\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.529011 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-combined-ca-bundle\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.529027 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-httpd-config\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.529071 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-config\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.529154 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " 
pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.529766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.529890 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.530277 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.531431 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-config\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.546304 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.547109 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-svc\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: 
\"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.547810 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.560955 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.561065 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.561112 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.562226 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-29w88" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.583105 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fg4\" (UniqueName: \"kubernetes.io/projected/a75259f2-d600-45f5-b048-12af33294954-kube-api-access-v5fg4\") pod \"dnsmasq-dns-688c87cc99-pw9sj\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.590031 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6688b77c56-lfnbd"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.623452 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.631172 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " 
pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.631778 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-424j5\" (UniqueName: \"kubernetes.io/projected/0067b767-3e5f-41d1-ba56-d64799b81a8c-kube-api-access-424j5\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.631836 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-combined-ca-bundle\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.631858 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-httpd-config\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.631890 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-scripts\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.632183 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: 
I0130 21:38:06.632230 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dttrk\" (UniqueName: \"kubernetes.io/projected/24972395-443a-40fc-a82c-e62e4f0e2192-kube-api-access-dttrk\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.632250 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.632328 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24972395-443a-40fc-a82c-e62e4f0e2192-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.632408 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-config\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.632439 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-ovndb-tls-certs\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.650144 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-ovndb-tls-certs\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.653008 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-combined-ca-bundle\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.660202 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-httpd-config\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.673185 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-config\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.684791 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-424j5\" (UniqueName: \"kubernetes.io/projected/0067b767-3e5f-41d1-ba56-d64799b81a8c-kube-api-access-424j5\") pod \"neutron-6688b77c56-lfnbd\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.685056 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-pw9sj"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.689317 4834 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.700451 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7p2sv"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.701933 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735462 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735526 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dttrk\" (UniqueName: \"kubernetes.io/projected/24972395-443a-40fc-a82c-e62e4f0e2192-kube-api-access-dttrk\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735572 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 
21:38:06.735617 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735640 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24972395-443a-40fc-a82c-e62e4f0e2192-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735669 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735693 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-config\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735727 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggct2\" (UniqueName: \"kubernetes.io/projected/f94b59b7-cd95-45e4-ac51-213483a8cd62-kube-api-access-ggct2\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc 
kubenswrapper[4834]: I0130 21:38:06.735757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735782 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.735825 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-scripts\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.736593 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24972395-443a-40fc-a82c-e62e4f0e2192-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.742985 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.750870 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-scripts\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.753906 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.756479 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.769650 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dttrk\" (UniqueName: \"kubernetes.io/projected/24972395-443a-40fc-a82c-e62e4f0e2192-kube-api-access-dttrk\") pod \"cinder-scheduler-0\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.771543 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.786823 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7p2sv"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.836832 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.839563 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.839690 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.839740 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.839775 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.839801 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-config\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 
21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.839839 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggct2\" (UniqueName: \"kubernetes.io/projected/f94b59b7-cd95-45e4-ac51-213483a8cd62-kube-api-access-ggct2\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.840464 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.840984 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.841512 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.842211 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-config\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.848735 4834 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.850366 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:38:06 crc kubenswrapper[4834]: W0130 21:38:06.859282 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dfc4a01_3fc1_4360_bd1a_3643d5da2b05.slice/crio-9bf9d959f9948be1c444ae5fad79c9fb75a4224bf8859c5e91e35ad507693f0f WatchSource:0}: Error finding container 9bf9d959f9948be1c444ae5fad79c9fb75a4224bf8859c5e91e35ad507693f0f: Status 404 returned error can't find the container with id 9bf9d959f9948be1c444ae5fad79c9fb75a4224bf8859c5e91e35ad507693f0f Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.859555 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.860041 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.863461 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.871381 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggct2\" (UniqueName: \"kubernetes.io/projected/f94b59b7-cd95-45e4-ac51-213483a8cd62-kube-api-access-ggct2\") pod \"dnsmasq-dns-6bb4fc677f-7p2sv\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:06 crc kubenswrapper[4834]: W0130 21:38:06.913371 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42b9899_e1fe_422f_af90_b08348a78694.slice/crio-7c88b808af391a25a66e501c1f354a167ff7ffdbadfe3e3352968d40aeabf633 WatchSource:0}: Error finding container 7c88b808af391a25a66e501c1f354a167ff7ffdbadfe3e3352968d40aeabf633: Status 404 returned error can't find the container with id 7c88b808af391a25a66e501c1f354a167ff7ffdbadfe3e3352968d40aeabf633 Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.913836 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-qsvxj"] Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.941816 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bp4s\" (UniqueName: \"kubernetes.io/projected/895eb736-c321-44be-b011-4b381fad441d-kube-api-access-5bp4s\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.942191 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895eb736-c321-44be-b011-4b381fad441d-logs\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.942240 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.942305 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/895eb736-c321-44be-b011-4b381fad441d-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.942354 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data-custom\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.942505 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.942611 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-scripts\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:06 crc kubenswrapper[4834]: I0130 21:38:06.943345 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76bf79b888-r57rl"] Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.017507 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76bf79b888-r57rl" event={"ID":"e42b9899-e1fe-422f-af90-b08348a78694","Type":"ContainerStarted","Data":"7c88b808af391a25a66e501c1f354a167ff7ffdbadfe3e3352968d40aeabf633"} Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.018937 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" event={"ID":"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05","Type":"ContainerStarted","Data":"9bf9d959f9948be1c444ae5fad79c9fb75a4224bf8859c5e91e35ad507693f0f"} Jan 30 
21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.021657 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fffb48c8c-724zs" event={"ID":"e44120e0-1332-4341-8638-917c3f2f1760","Type":"ContainerStarted","Data":"1ee5493d28e74927e7eac62c2b33724cda9c7e2b657540ae8d20600603a4441c"} Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.022766 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" event={"ID":"a7d82a62-7407-4012-9f92-8b0abe1afa08","Type":"ContainerStarted","Data":"3f9365ebc65a6a24c6cd6f33c4ade5389a5a6af746c52c20c381c91b1636bf1c"} Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.044637 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data-custom\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.044724 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.044753 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-scripts\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.044800 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bp4s\" (UniqueName: \"kubernetes.io/projected/895eb736-c321-44be-b011-4b381fad441d-kube-api-access-5bp4s\") pod \"cinder-api-0\" 
(UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.044832 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895eb736-c321-44be-b011-4b381fad441d-logs\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.044848 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.044882 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/895eb736-c321-44be-b011-4b381fad441d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.044976 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/895eb736-c321-44be-b011-4b381fad441d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.045777 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895eb736-c321-44be-b011-4b381fad441d-logs\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.053009 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.054731 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.064643 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-scripts\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.065099 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data-custom\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.083897 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.093882 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bp4s\" (UniqueName: \"kubernetes.io/projected/895eb736-c321-44be-b011-4b381fad441d-kube-api-access-5bp4s\") pod \"cinder-api-0\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.186387 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.225349 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-pw9sj"] Jan 30 21:38:07 crc kubenswrapper[4834]: W0130 21:38:07.234724 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda75259f2_d600_45f5_b048_12af33294954.slice/crio-6a885f96db26f8c81352d6b1745f130d1a09aace2f3d3388477ceb20373d0578 WatchSource:0}: Error finding container 6a885f96db26f8c81352d6b1745f130d1a09aace2f3d3388477ceb20373d0578: Status 404 returned error can't find the container with id 6a885f96db26f8c81352d6b1745f130d1a09aace2f3d3388477ceb20373d0578 Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.357128 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.555574 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6688b77c56-lfnbd"] Jan 30 21:38:07 crc kubenswrapper[4834]: W0130 21:38:07.572133 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0067b767_3e5f_41d1_ba56_d64799b81a8c.slice/crio-4612a3bba0651f77622deea299f7faf63296bf37d6159fbd1b12258b776b22f0 WatchSource:0}: Error finding container 4612a3bba0651f77622deea299f7faf63296bf37d6159fbd1b12258b776b22f0: Status 404 returned error can't find the container with id 4612a3bba0651f77622deea299f7faf63296bf37d6159fbd1b12258b776b22f0 Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.596648 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7p2sv"] Jan 30 21:38:07 crc kubenswrapper[4834]: I0130 21:38:07.713543 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:07 crc kubenswrapper[4834]: W0130 21:38:07.730135 4834 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod895eb736_c321_44be_b011_4b381fad441d.slice/crio-40943534bafd2d35f5ac2e96af6555d38d9c1a51ab334ca6c8829d5e9e237fb3 WatchSource:0}: Error finding container 40943534bafd2d35f5ac2e96af6555d38d9c1a51ab334ca6c8829d5e9e237fb3: Status 404 returned error can't find the container with id 40943534bafd2d35f5ac2e96af6555d38d9c1a51ab334ca6c8829d5e9e237fb3 Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.034782 4834 generic.go:334] "Generic (PLEG): container finished" podID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerID="75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed" exitCode=0 Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.034847 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" event={"ID":"f94b59b7-cd95-45e4-ac51-213483a8cd62","Type":"ContainerDied","Data":"75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.034874 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" event={"ID":"f94b59b7-cd95-45e4-ac51-213483a8cd62","Type":"ContainerStarted","Data":"11dabcceb331518922d924ac4dfc049e0f2376559c758edbae3e23b54cba2851"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.042656 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"895eb736-c321-44be-b011-4b381fad441d","Type":"ContainerStarted","Data":"40943534bafd2d35f5ac2e96af6555d38d9c1a51ab334ca6c8829d5e9e237fb3"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.049723 4834 generic.go:334] "Generic (PLEG): container finished" podID="a75259f2-d600-45f5-b048-12af33294954" containerID="4fd068ff27a25728928a84b04639808cadce5358b61874bb7852e5c3bd6de430" exitCode=0 Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.049821 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" event={"ID":"a75259f2-d600-45f5-b048-12af33294954","Type":"ContainerDied","Data":"4fd068ff27a25728928a84b04639808cadce5358b61874bb7852e5c3bd6de430"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.049854 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" event={"ID":"a75259f2-d600-45f5-b048-12af33294954","Type":"ContainerStarted","Data":"6a885f96db26f8c81352d6b1745f130d1a09aace2f3d3388477ceb20373d0578"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.071706 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerStarted","Data":"cdad460eac0011adb94718dc096811ea2034d25ba0ef1617f6648ab95ea4bad9"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.072668 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.087021 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76bf79b888-r57rl" event={"ID":"e42b9899-e1fe-422f-af90-b08348a78694","Type":"ContainerStarted","Data":"2355a06540d2a97f4eefc4159ab8134ced96973c1e9439e57cd57e3314fbfd85"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.087065 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76bf79b888-r57rl" event={"ID":"e42b9899-e1fe-422f-af90-b08348a78694","Type":"ContainerStarted","Data":"4b9d2e7f674a6626c69d9e1030da0e2ce2bd7d96dfd7b3a48606e98f9edd0035"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.087829 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.087857 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 
21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.106311 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.548011321 podStartE2EDuration="7.10629082s" podCreationTimestamp="2026-01-30 21:38:01 +0000 UTC" firstStartedPulling="2026-01-30 21:38:02.834259472 +0000 UTC m=+1333.987405620" lastFinishedPulling="2026-01-30 21:38:07.392538991 +0000 UTC m=+1338.545685119" observedRunningTime="2026-01-30 21:38:08.097683468 +0000 UTC m=+1339.250829606" watchObservedRunningTime="2026-01-30 21:38:08.10629082 +0000 UTC m=+1339.259436958" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.155792 4834 generic.go:334] "Generic (PLEG): container finished" podID="8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" containerID="d9aba8e0e02db9fde740f7650bc77fbef01791b4da6c43a97c2e43a8cf1dc167" exitCode=0 Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.155853 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" event={"ID":"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05","Type":"ContainerDied","Data":"d9aba8e0e02db9fde740f7650bc77fbef01791b4da6c43a97c2e43a8cf1dc167"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.160091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24972395-443a-40fc-a82c-e62e4f0e2192","Type":"ContainerStarted","Data":"dfe9c5d4e7feb9bd75c4b1680a38d56954a4274de583068a82817e71e754ccbc"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.161327 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6688b77c56-lfnbd" event={"ID":"0067b767-3e5f-41d1-ba56-d64799b81a8c","Type":"ContainerStarted","Data":"92649105945e5fae740063a5468a15769cb5de8f12043adf683338238c63010d"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.161345 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6688b77c56-lfnbd" 
event={"ID":"0067b767-3e5f-41d1-ba56-d64799b81a8c","Type":"ContainerStarted","Data":"4612a3bba0651f77622deea299f7faf63296bf37d6159fbd1b12258b776b22f0"} Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.189915 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76bf79b888-r57rl" podStartSLOduration=3.18989735 podStartE2EDuration="3.18989735s" podCreationTimestamp="2026-01-30 21:38:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:08.13154518 +0000 UTC m=+1339.284691308" watchObservedRunningTime="2026-01-30 21:38:08.18989735 +0000 UTC m=+1339.343043488" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.540531 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.574631 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-swift-storage-0\") pod \"a75259f2-d600-45f5-b048-12af33294954\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.574752 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5fg4\" (UniqueName: \"kubernetes.io/projected/a75259f2-d600-45f5-b048-12af33294954-kube-api-access-v5fg4\") pod \"a75259f2-d600-45f5-b048-12af33294954\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.574793 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-nb\") pod \"a75259f2-d600-45f5-b048-12af33294954\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " Jan 
30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.574879 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-config\") pod \"a75259f2-d600-45f5-b048-12af33294954\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.574928 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-svc\") pod \"a75259f2-d600-45f5-b048-12af33294954\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.575074 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-sb\") pod \"a75259f2-d600-45f5-b048-12af33294954\" (UID: \"a75259f2-d600-45f5-b048-12af33294954\") " Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.581184 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75259f2-d600-45f5-b048-12af33294954-kube-api-access-v5fg4" (OuterVolumeSpecName: "kube-api-access-v5fg4") pod "a75259f2-d600-45f5-b048-12af33294954" (UID: "a75259f2-d600-45f5-b048-12af33294954"). InnerVolumeSpecName "kube-api-access-v5fg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.612755 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a75259f2-d600-45f5-b048-12af33294954" (UID: "a75259f2-d600-45f5-b048-12af33294954"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.618027 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a75259f2-d600-45f5-b048-12af33294954" (UID: "a75259f2-d600-45f5-b048-12af33294954"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.618445 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a75259f2-d600-45f5-b048-12af33294954" (UID: "a75259f2-d600-45f5-b048-12af33294954"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.626262 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-config" (OuterVolumeSpecName: "config") pod "a75259f2-d600-45f5-b048-12af33294954" (UID: "a75259f2-d600-45f5-b048-12af33294954"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.673783 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a75259f2-d600-45f5-b048-12af33294954" (UID: "a75259f2-d600-45f5-b048-12af33294954"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.677825 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.677862 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.677878 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5fg4\" (UniqueName: \"kubernetes.io/projected/a75259f2-d600-45f5-b048-12af33294954-kube-api-access-v5fg4\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.677891 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.677902 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:08 crc kubenswrapper[4834]: I0130 21:38:08.677910 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a75259f2-d600-45f5-b048-12af33294954-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.124718 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.176310 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"895eb736-c321-44be-b011-4b381fad441d","Type":"ContainerStarted","Data":"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993"} Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.178479 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" event={"ID":"a75259f2-d600-45f5-b048-12af33294954","Type":"ContainerDied","Data":"6a885f96db26f8c81352d6b1745f130d1a09aace2f3d3388477ceb20373d0578"} Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.178530 4834 scope.go:117] "RemoveContainer" containerID="4fd068ff27a25728928a84b04639808cadce5358b61874bb7852e5c3bd6de430" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.178649 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-pw9sj" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.183732 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" event={"ID":"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05","Type":"ContainerDied","Data":"9bf9d959f9948be1c444ae5fad79c9fb75a4224bf8859c5e91e35ad507693f0f"} Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.183922 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.185066 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s48c\" (UniqueName: \"kubernetes.io/projected/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-kube-api-access-8s48c\") pod \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.185159 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-config\") pod \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.185236 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-swift-storage-0\") pod \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.185301 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-svc\") pod \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.185332 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-sb\") pod \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.185557 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-nb\") pod \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\" (UID: \"8dfc4a01-3fc1-4360-bd1a-3643d5da2b05\") " Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.189094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-kube-api-access-8s48c" (OuterVolumeSpecName: "kube-api-access-8s48c") pod "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" (UID: "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05"). InnerVolumeSpecName "kube-api-access-8s48c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.274025 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" (UID: "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.277558 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-config" (OuterVolumeSpecName: "config") pod "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" (UID: "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.277987 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" (UID: "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.278908 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" (UID: "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.281467 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" (UID: "8dfc4a01-3fc1-4360-bd1a-3643d5da2b05"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.286444 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-pw9sj"] Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.288761 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.288791 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s48c\" (UniqueName: \"kubernetes.io/projected/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-kube-api-access-8s48c\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.288802 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.288812 4834 
reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.288822 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.288831 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.294868 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-pw9sj"] Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.444922 4834 scope.go:117] "RemoveContainer" containerID="d9aba8e0e02db9fde740f7650bc77fbef01791b4da6c43a97c2e43a8cf1dc167" Jan 30 21:38:09 crc kubenswrapper[4834]: I0130 21:38:09.558986 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75259f2-d600-45f5-b048-12af33294954" path="/var/lib/kubelet/pods/a75259f2-d600-45f5-b048-12af33294954/volumes" Jan 30 21:38:10 crc kubenswrapper[4834]: I0130 21:38:10.199728 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" event={"ID":"a7d82a62-7407-4012-9f92-8b0abe1afa08","Type":"ContainerStarted","Data":"060e303883dc46c6c5518869c0ea51b9d9a45ae5cf0354e1ffffbb1c1369d29f"} Jan 30 21:38:10 crc kubenswrapper[4834]: I0130 21:38:10.213809 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6688b77c56-lfnbd" event={"ID":"0067b767-3e5f-41d1-ba56-d64799b81a8c","Type":"ContainerStarted","Data":"afd4cf863cc75c6550395bfed730ef3a3323248c97988c6a2e8a2dc82bd3cfd5"} Jan 30 21:38:10 crc 
kubenswrapper[4834]: I0130 21:38:10.214899 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:10 crc kubenswrapper[4834]: I0130 21:38:10.236737 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" event={"ID":"f94b59b7-cd95-45e4-ac51-213483a8cd62","Type":"ContainerStarted","Data":"a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0"} Jan 30 21:38:10 crc kubenswrapper[4834]: I0130 21:38:10.237592 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:10 crc kubenswrapper[4834]: I0130 21:38:10.238011 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:10 crc kubenswrapper[4834]: I0130 21:38:10.250483 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6688b77c56-lfnbd" podStartSLOduration=4.250468332 podStartE2EDuration="4.250468332s" podCreationTimestamp="2026-01-30 21:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:10.248884667 +0000 UTC m=+1341.402030805" watchObservedRunningTime="2026-01-30 21:38:10.250468332 +0000 UTC m=+1341.403614470" Jan 30 21:38:10 crc kubenswrapper[4834]: I0130 21:38:10.259829 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fffb48c8c-724zs" event={"ID":"e44120e0-1332-4341-8638-917c3f2f1760","Type":"ContainerStarted","Data":"91a6b5711d168b04559de5c04e4b1ecdb7ee87afb9d198a351d7da18689bf882"} Jan 30 21:38:10 crc kubenswrapper[4834]: I0130 21:38:10.281518 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" podStartSLOduration=4.281493004 podStartE2EDuration="4.281493004s" podCreationTimestamp="2026-01-30 21:38:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:10.268172199 +0000 UTC m=+1341.421318337" watchObservedRunningTime="2026-01-30 21:38:10.281493004 +0000 UTC m=+1341.434639162" Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.269597 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fffb48c8c-724zs" event={"ID":"e44120e0-1332-4341-8638-917c3f2f1760","Type":"ContainerStarted","Data":"02ecd93fa91c9d1b0a3e9ab0eed9aefad76f3378b2f7a2af4caf307149517c0b"} Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.272665 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"895eb736-c321-44be-b011-4b381fad441d","Type":"ContainerStarted","Data":"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0"} Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.272972 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.272791 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="895eb736-c321-44be-b011-4b381fad441d" containerName="cinder-api-log" containerID="cri-o://009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993" gracePeriod=30 Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.272743 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="895eb736-c321-44be-b011-4b381fad441d" containerName="cinder-api" containerID="cri-o://58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0" gracePeriod=30 Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.279963 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" 
event={"ID":"a7d82a62-7407-4012-9f92-8b0abe1afa08","Type":"ContainerStarted","Data":"6f984466562609c7d4ddd09225700f7132e02cbed9b8930d6faaaf8aac702ab8"} Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.281706 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24972395-443a-40fc-a82c-e62e4f0e2192","Type":"ContainerStarted","Data":"d309fd4b63a2c82d319fbcbe70b6aa1db6c81af8c09994d42551c5f5434d0c5b"} Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.292155 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-fffb48c8c-724zs" podStartSLOduration=2.996141335 podStartE2EDuration="6.292140447s" podCreationTimestamp="2026-01-30 21:38:05 +0000 UTC" firstStartedPulling="2026-01-30 21:38:06.12260817 +0000 UTC m=+1337.275754308" lastFinishedPulling="2026-01-30 21:38:09.418607282 +0000 UTC m=+1340.571753420" observedRunningTime="2026-01-30 21:38:11.289530684 +0000 UTC m=+1342.442676822" watchObservedRunningTime="2026-01-30 21:38:11.292140447 +0000 UTC m=+1342.445286585" Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.322060 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6dd649c8b-p8wq8" podStartSLOduration=3.023585146 podStartE2EDuration="6.322046478s" podCreationTimestamp="2026-01-30 21:38:05 +0000 UTC" firstStartedPulling="2026-01-30 21:38:06.122161967 +0000 UTC m=+1337.275308105" lastFinishedPulling="2026-01-30 21:38:09.420623299 +0000 UTC m=+1340.573769437" observedRunningTime="2026-01-30 21:38:11.314578178 +0000 UTC m=+1342.467724316" watchObservedRunningTime="2026-01-30 21:38:11.322046478 +0000 UTC m=+1342.475192606" Jan 30 21:38:11 crc kubenswrapper[4834]: I0130 21:38:11.356434 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.356413073 podStartE2EDuration="5.356413073s" podCreationTimestamp="2026-01-30 21:38:06 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:11.342987066 +0000 UTC m=+1342.496133204" watchObservedRunningTime="2026-01-30 21:38:11.356413073 +0000 UTC m=+1342.509559211" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.182439 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.295893 4834 generic.go:334] "Generic (PLEG): container finished" podID="895eb736-c321-44be-b011-4b381fad441d" containerID="58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0" exitCode=0 Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.295923 4834 generic.go:334] "Generic (PLEG): container finished" podID="895eb736-c321-44be-b011-4b381fad441d" containerID="009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993" exitCode=143 Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.295968 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"895eb736-c321-44be-b011-4b381fad441d","Type":"ContainerDied","Data":"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0"} Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.295996 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"895eb736-c321-44be-b011-4b381fad441d","Type":"ContainerDied","Data":"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993"} Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.296006 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"895eb736-c321-44be-b011-4b381fad441d","Type":"ContainerDied","Data":"40943534bafd2d35f5ac2e96af6555d38d9c1a51ab334ca6c8829d5e9e237fb3"} Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.296021 4834 scope.go:117] "RemoveContainer" 
containerID="58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.296178 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.313665 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24972395-443a-40fc-a82c-e62e4f0e2192","Type":"ContainerStarted","Data":"c98617e6f9ebcea636edbe6c5ccb33beab7c9a970348b990280b8f7e1a05d437"} Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.333704 4834 scope.go:117] "RemoveContainer" containerID="009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.342989 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.15714584 podStartE2EDuration="6.342974441s" podCreationTimestamp="2026-01-30 21:38:06 +0000 UTC" firstStartedPulling="2026-01-30 21:38:07.370475881 +0000 UTC m=+1338.523622019" lastFinishedPulling="2026-01-30 21:38:09.556304482 +0000 UTC m=+1340.709450620" observedRunningTime="2026-01-30 21:38:12.341794548 +0000 UTC m=+1343.494940686" watchObservedRunningTime="2026-01-30 21:38:12.342974441 +0000 UTC m=+1343.496120579" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.367304 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-scripts\") pod \"895eb736-c321-44be-b011-4b381fad441d\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.367354 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/895eb736-c321-44be-b011-4b381fad441d-etc-machine-id\") pod \"895eb736-c321-44be-b011-4b381fad441d\" (UID: 
\"895eb736-c321-44be-b011-4b381fad441d\") " Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.367385 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data\") pod \"895eb736-c321-44be-b011-4b381fad441d\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.367428 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-combined-ca-bundle\") pod \"895eb736-c321-44be-b011-4b381fad441d\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.367460 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bp4s\" (UniqueName: \"kubernetes.io/projected/895eb736-c321-44be-b011-4b381fad441d-kube-api-access-5bp4s\") pod \"895eb736-c321-44be-b011-4b381fad441d\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.367484 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data-custom\") pod \"895eb736-c321-44be-b011-4b381fad441d\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.367534 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895eb736-c321-44be-b011-4b381fad441d-logs\") pod \"895eb736-c321-44be-b011-4b381fad441d\" (UID: \"895eb736-c321-44be-b011-4b381fad441d\") " Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.368602 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/895eb736-c321-44be-b011-4b381fad441d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "895eb736-c321-44be-b011-4b381fad441d" (UID: "895eb736-c321-44be-b011-4b381fad441d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.371990 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/895eb736-c321-44be-b011-4b381fad441d-logs" (OuterVolumeSpecName: "logs") pod "895eb736-c321-44be-b011-4b381fad441d" (UID: "895eb736-c321-44be-b011-4b381fad441d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.385931 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-scripts" (OuterVolumeSpecName: "scripts") pod "895eb736-c321-44be-b011-4b381fad441d" (UID: "895eb736-c321-44be-b011-4b381fad441d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.397565 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "895eb736-c321-44be-b011-4b381fad441d" (UID: "895eb736-c321-44be-b011-4b381fad441d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.397697 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895eb736-c321-44be-b011-4b381fad441d-kube-api-access-5bp4s" (OuterVolumeSpecName: "kube-api-access-5bp4s") pod "895eb736-c321-44be-b011-4b381fad441d" (UID: "895eb736-c321-44be-b011-4b381fad441d"). InnerVolumeSpecName "kube-api-access-5bp4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.418707 4834 scope.go:117] "RemoveContainer" containerID="58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0" Jan 30 21:38:12 crc kubenswrapper[4834]: E0130 21:38:12.422489 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0\": container with ID starting with 58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0 not found: ID does not exist" containerID="58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.422531 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0"} err="failed to get container status \"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0\": rpc error: code = NotFound desc = could not find container \"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0\": container with ID starting with 58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0 not found: ID does not exist" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.422556 4834 scope.go:117] "RemoveContainer" containerID="009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993" Jan 30 21:38:12 crc kubenswrapper[4834]: E0130 21:38:12.423817 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993\": container with ID starting with 009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993 not found: ID does not exist" containerID="009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.423841 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993"} err="failed to get container status \"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993\": rpc error: code = NotFound desc = could not find container \"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993\": container with ID starting with 009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993 not found: ID does not exist" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.423855 4834 scope.go:117] "RemoveContainer" containerID="58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.424195 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0"} err="failed to get container status \"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0\": rpc error: code = NotFound desc = could not find container \"58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0\": container with ID starting with 58da551dc478d0657ffc599051008f17864614f27d28d4eaadba70b8c7b156d0 not found: ID does not exist" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.424215 4834 scope.go:117] "RemoveContainer" containerID="009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.426315 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993"} err="failed to get container status \"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993\": rpc error: code = NotFound desc = could not find container \"009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993\": container with ID starting with 
009df1b2b2143cafa208c78ef88cac4bfd91e121402e10d2b4855813b2e89993 not found: ID does not exist" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.434546 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "895eb736-c321-44be-b011-4b381fad441d" (UID: "895eb736-c321-44be-b011-4b381fad441d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.469170 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/895eb736-c321-44be-b011-4b381fad441d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.469198 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.469209 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bp4s\" (UniqueName: \"kubernetes.io/projected/895eb736-c321-44be-b011-4b381fad441d-kube-api-access-5bp4s\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.469220 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.469229 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895eb736-c321-44be-b011-4b381fad441d-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.469236 4834 reconciler_common.go:293] "Volume detached 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.496610 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data" (OuterVolumeSpecName: "config-data") pod "895eb736-c321-44be-b011-4b381fad441d" (UID: "895eb736-c321-44be-b011-4b381fad441d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.571242 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895eb736-c321-44be-b011-4b381fad441d-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.650769 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.659105 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673239 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:12 crc kubenswrapper[4834]: E0130 21:38:12.673646 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895eb736-c321-44be-b011-4b381fad441d" containerName="cinder-api-log" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673661 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="895eb736-c321-44be-b011-4b381fad441d" containerName="cinder-api-log" Jan 30 21:38:12 crc kubenswrapper[4834]: E0130 21:38:12.673681 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895eb736-c321-44be-b011-4b381fad441d" containerName="cinder-api" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673687 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="895eb736-c321-44be-b011-4b381fad441d" containerName="cinder-api" Jan 30 21:38:12 crc kubenswrapper[4834]: E0130 21:38:12.673699 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75259f2-d600-45f5-b048-12af33294954" containerName="init" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673706 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75259f2-d600-45f5-b048-12af33294954" containerName="init" Jan 30 21:38:12 crc kubenswrapper[4834]: E0130 21:38:12.673722 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" containerName="init" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673733 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" containerName="init" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673906 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="895eb736-c321-44be-b011-4b381fad441d" containerName="cinder-api-log" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673926 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="895eb736-c321-44be-b011-4b381fad441d" containerName="cinder-api" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673939 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" containerName="init" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.673950 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75259f2-d600-45f5-b048-12af33294954" containerName="init" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.675085 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.682267 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.684066 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.684407 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.685087 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774310 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774442 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-scripts\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774481 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774614 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774666 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-logs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774780 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m9p7\" (UniqueName: \"kubernetes.io/projected/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-kube-api-access-9m9p7\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774838 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774866 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.774892 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-config-data\") pod 
\"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.778635 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f4bb75569-jlmhp"] Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.780230 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.784784 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.784915 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.789049 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f4bb75569-jlmhp"] Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.877991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878052 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-public-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878085 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-combined-ca-bundle\") pod 
\"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878117 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-logs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878216 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m9p7\" (UniqueName: \"kubernetes.io/projected/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-kube-api-access-9m9p7\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878243 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-config\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878268 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-internal-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878299 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc 
kubenswrapper[4834]: I0130 21:38:12.878350 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878376 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-config-data\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878426 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878453 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-httpd-config\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878492 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-logs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878498 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-ovndb-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5j9l\" (UniqueName: \"kubernetes.io/projected/9f76b7e7-c7af-4251-b083-406a76dd6a7f-kube-api-access-d5j9l\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878819 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878875 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-scripts\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.878904 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.883925 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " 
pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.884035 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.889234 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-config-data-custom\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.889721 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-scripts\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.890216 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.890524 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-config-data\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.918231 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m9p7\" (UniqueName: 
\"kubernetes.io/projected/d2c94b1f-de81-4b06-ba59-e0466d8cd5c7-kube-api-access-9m9p7\") pod \"cinder-api-0\" (UID: \"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7\") " pod="openstack/cinder-api-0" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.980547 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5j9l\" (UniqueName: \"kubernetes.io/projected/9f76b7e7-c7af-4251-b083-406a76dd6a7f-kube-api-access-d5j9l\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.980651 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-public-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.980671 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-combined-ca-bundle\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.980735 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-config\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.980760 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-internal-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" 
(UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.980804 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-httpd-config\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.980840 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-ovndb-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.987309 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-config\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.987427 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-combined-ca-bundle\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.987983 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-public-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 
21:38:12.988701 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-httpd-config\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:12 crc kubenswrapper[4834]: I0130 21:38:12.997179 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-ovndb-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:13 crc kubenswrapper[4834]: I0130 21:38:13.001457 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76b7e7-c7af-4251-b083-406a76dd6a7f-internal-tls-certs\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:13 crc kubenswrapper[4834]: I0130 21:38:13.008251 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 30 21:38:13 crc kubenswrapper[4834]: I0130 21:38:13.010073 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5j9l\" (UniqueName: \"kubernetes.io/projected/9f76b7e7-c7af-4251-b083-406a76dd6a7f-kube-api-access-d5j9l\") pod \"neutron-f4bb75569-jlmhp\" (UID: \"9f76b7e7-c7af-4251-b083-406a76dd6a7f\") " pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:13 crc kubenswrapper[4834]: I0130 21:38:13.144872 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:13 crc kubenswrapper[4834]: I0130 21:38:13.518473 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 30 21:38:13 crc kubenswrapper[4834]: W0130 21:38:13.535969 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2c94b1f_de81_4b06_ba59_e0466d8cd5c7.slice/crio-b53202b8a70ce1d221949dbb8e9e84b8d81d56541d900ed4f69e26fa69218146 WatchSource:0}: Error finding container b53202b8a70ce1d221949dbb8e9e84b8d81d56541d900ed4f69e26fa69218146: Status 404 returned error can't find the container with id b53202b8a70ce1d221949dbb8e9e84b8d81d56541d900ed4f69e26fa69218146 Jan 30 21:38:13 crc kubenswrapper[4834]: I0130 21:38:13.554257 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895eb736-c321-44be-b011-4b381fad441d" path="/var/lib/kubelet/pods/895eb736-c321-44be-b011-4b381fad441d/volumes" Jan 30 21:38:13 crc kubenswrapper[4834]: I0130 21:38:13.706100 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f4bb75569-jlmhp"] Jan 30 21:38:13 crc kubenswrapper[4834]: W0130 21:38:13.731520 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f76b7e7_c7af_4251_b083_406a76dd6a7f.slice/crio-7f466cb58be999b99f660681c25da3c55c06f016ddba36253491fbfc1066e856 WatchSource:0}: Error finding container 7f466cb58be999b99f660681c25da3c55c06f016ddba36253491fbfc1066e856: Status 404 returned error can't find the container with id 7f466cb58be999b99f660681c25da3c55c06f016ddba36253491fbfc1066e856 Jan 30 21:38:14 crc kubenswrapper[4834]: I0130 21:38:14.367806 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4bb75569-jlmhp" 
event={"ID":"9f76b7e7-c7af-4251-b083-406a76dd6a7f","Type":"ContainerStarted","Data":"a4a3ec22d97696ca2eb31c4653b5e2f4a6e38c85f0223277eb9829950dbc9d86"} Jan 30 21:38:14 crc kubenswrapper[4834]: I0130 21:38:14.368063 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4bb75569-jlmhp" event={"ID":"9f76b7e7-c7af-4251-b083-406a76dd6a7f","Type":"ContainerStarted","Data":"7f466cb58be999b99f660681c25da3c55c06f016ddba36253491fbfc1066e856"} Jan 30 21:38:14 crc kubenswrapper[4834]: I0130 21:38:14.369372 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7","Type":"ContainerStarted","Data":"5700a784767dfbe2886824fbdf3527cc4fa871b85e3a1d7fe9a9bc022e4d9796"} Jan 30 21:38:14 crc kubenswrapper[4834]: I0130 21:38:14.369409 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7","Type":"ContainerStarted","Data":"b53202b8a70ce1d221949dbb8e9e84b8d81d56541d900ed4f69e26fa69218146"} Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.380213 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f4bb75569-jlmhp" event={"ID":"9f76b7e7-c7af-4251-b083-406a76dd6a7f","Type":"ContainerStarted","Data":"1a674e8f043b5a5dde96f92b075f9d942a211e0fc33e9de988369483bc691458"} Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.381539 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.382215 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d2c94b1f-de81-4b06-ba59-e0466d8cd5c7","Type":"ContainerStarted","Data":"adfc1828779df887a5d12c48f703f1cafa5bece7da4184600cf880a81ecba00f"} Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.382347 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cinder-api-0" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.398028 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f4bb75569-jlmhp" podStartSLOduration=3.397998451 podStartE2EDuration="3.397998451s" podCreationTimestamp="2026-01-30 21:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:15.3965179 +0000 UTC m=+1346.549664038" watchObservedRunningTime="2026-01-30 21:38:15.397998451 +0000 UTC m=+1346.551144589" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.431767 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.43174639 podStartE2EDuration="3.43174639s" podCreationTimestamp="2026-01-30 21:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:15.425478124 +0000 UTC m=+1346.578624272" watchObservedRunningTime="2026-01-30 21:38:15.43174639 +0000 UTC m=+1346.584892528" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.553552 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7ff95bcc58-fzksr"] Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.555012 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.558276 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.558541 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.566223 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ff95bcc58-fzksr"] Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.754534 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-internal-tls-certs\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.754603 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-config-data\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.754740 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-config-data-custom\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.754841 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-public-tls-certs\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.754995 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de560559-c091-48f7-a2d2-f9f0fffcec65-logs\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.755141 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chk8l\" (UniqueName: \"kubernetes.io/projected/de560559-c091-48f7-a2d2-f9f0fffcec65-kube-api-access-chk8l\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.755225 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-combined-ca-bundle\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.858003 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chk8l\" (UniqueName: \"kubernetes.io/projected/de560559-c091-48f7-a2d2-f9f0fffcec65-kube-api-access-chk8l\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.858085 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-combined-ca-bundle\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.858119 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-internal-tls-certs\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.858160 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-config-data\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.859321 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-config-data-custom\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.859355 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-public-tls-certs\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.859425 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/de560559-c091-48f7-a2d2-f9f0fffcec65-logs\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.859738 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de560559-c091-48f7-a2d2-f9f0fffcec65-logs\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.867107 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-combined-ca-bundle\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.867123 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-public-tls-certs\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.867546 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-config-data\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.868231 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-internal-tls-certs\") pod 
\"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.868294 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de560559-c091-48f7-a2d2-f9f0fffcec65-config-data-custom\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:15 crc kubenswrapper[4834]: I0130 21:38:15.890931 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chk8l\" (UniqueName: \"kubernetes.io/projected/de560559-c091-48f7-a2d2-f9f0fffcec65-kube-api-access-chk8l\") pod \"barbican-api-7ff95bcc58-fzksr\" (UID: \"de560559-c091-48f7-a2d2-f9f0fffcec65\") " pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:16 crc kubenswrapper[4834]: I0130 21:38:16.174104 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:16 crc kubenswrapper[4834]: I0130 21:38:16.757254 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7ff95bcc58-fzksr"] Jan 30 21:38:16 crc kubenswrapper[4834]: W0130 21:38:16.758825 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde560559_c091_48f7_a2d2_f9f0fffcec65.slice/crio-266bcaaa64e97f94a816e6dc6ef2f7a4b5f8740bf0fb6b93f0678b3998e37d41 WatchSource:0}: Error finding container 266bcaaa64e97f94a816e6dc6ef2f7a4b5f8740bf0fb6b93f0678b3998e37d41: Status 404 returned error can't find the container with id 266bcaaa64e97f94a816e6dc6ef2f7a4b5f8740bf0fb6b93f0678b3998e37d41 Jan 30 21:38:16 crc kubenswrapper[4834]: I0130 21:38:16.772274 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 21:38:17 crc kubenswrapper[4834]: I0130 21:38:17.003940 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 21:38:17 crc kubenswrapper[4834]: I0130 21:38:17.085695 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:38:17 crc kubenswrapper[4834]: I0130 21:38:17.162723 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-4lk6b"] Jan 30 21:38:17 crc kubenswrapper[4834]: I0130 21:38:17.162983 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" podUID="3dd33833-530e-4a48-9a21-c03a37d1c253" containerName="dnsmasq-dns" containerID="cri-o://f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57" gracePeriod=10 Jan 30 21:38:17 crc kubenswrapper[4834]: I0130 21:38:17.418420 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff95bcc58-fzksr" 
event={"ID":"de560559-c091-48f7-a2d2-f9f0fffcec65","Type":"ContainerStarted","Data":"266bcaaa64e97f94a816e6dc6ef2f7a4b5f8740bf0fb6b93f0678b3998e37d41"} Jan 30 21:38:17 crc kubenswrapper[4834]: I0130 21:38:17.510463 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:17 crc kubenswrapper[4834]: I0130 21:38:17.707727 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:17 crc kubenswrapper[4834]: I0130 21:38:17.775997 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.315459 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.322703 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-svc\") pod \"3dd33833-530e-4a48-9a21-c03a37d1c253\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.322791 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-nb\") pod \"3dd33833-530e-4a48-9a21-c03a37d1c253\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.322825 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-swift-storage-0\") pod \"3dd33833-530e-4a48-9a21-c03a37d1c253\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.322891 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-config\") pod \"3dd33833-530e-4a48-9a21-c03a37d1c253\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.322926 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-sb\") pod \"3dd33833-530e-4a48-9a21-c03a37d1c253\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.322986 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g69pq\" (UniqueName: \"kubernetes.io/projected/3dd33833-530e-4a48-9a21-c03a37d1c253-kube-api-access-g69pq\") pod \"3dd33833-530e-4a48-9a21-c03a37d1c253\" (UID: \"3dd33833-530e-4a48-9a21-c03a37d1c253\") " Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.332595 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd33833-530e-4a48-9a21-c03a37d1c253-kube-api-access-g69pq" (OuterVolumeSpecName: "kube-api-access-g69pq") pod "3dd33833-530e-4a48-9a21-c03a37d1c253" (UID: "3dd33833-530e-4a48-9a21-c03a37d1c253"). InnerVolumeSpecName "kube-api-access-g69pq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.431596 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g69pq\" (UniqueName: \"kubernetes.io/projected/3dd33833-530e-4a48-9a21-c03a37d1c253-kube-api-access-g69pq\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.468599 4834 generic.go:334] "Generic (PLEG): container finished" podID="3dd33833-530e-4a48-9a21-c03a37d1c253" containerID="f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57" exitCode=0 Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.468689 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" event={"ID":"3dd33833-530e-4a48-9a21-c03a37d1c253","Type":"ContainerDied","Data":"f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57"} Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.468721 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" event={"ID":"3dd33833-530e-4a48-9a21-c03a37d1c253","Type":"ContainerDied","Data":"f86e0085ac0484154b7953f08ef10c27d4180f5876e4f9fc16822477b210eba5"} Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.468738 4834 scope.go:117] "RemoveContainer" containerID="f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.468902 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-4lk6b" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.475767 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="24972395-443a-40fc-a82c-e62e4f0e2192" containerName="cinder-scheduler" containerID="cri-o://d309fd4b63a2c82d319fbcbe70b6aa1db6c81af8c09994d42551c5f5434d0c5b" gracePeriod=30 Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.475788 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff95bcc58-fzksr" event={"ID":"de560559-c091-48f7-a2d2-f9f0fffcec65","Type":"ContainerStarted","Data":"1e5adffb4d581c3c6218e583dbf24bcb5c0b46c738514e90816533deb03614ee"} Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.475822 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7ff95bcc58-fzksr" event={"ID":"de560559-c091-48f7-a2d2-f9f0fffcec65","Type":"ContainerStarted","Data":"1cb55d85e97c5de5b9ffe495c959ff8a2f5a9bb2d3313861f55b343e2a3bc95c"} Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.475838 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.475849 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.475885 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="24972395-443a-40fc-a82c-e62e4f0e2192" containerName="probe" containerID="cri-o://c98617e6f9ebcea636edbe6c5ccb33beab7c9a970348b990280b8f7e1a05d437" gracePeriod=30 Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.478039 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-swift-storage-0" (OuterVolumeSpecName: 
"dns-swift-storage-0") pod "3dd33833-530e-4a48-9a21-c03a37d1c253" (UID: "3dd33833-530e-4a48-9a21-c03a37d1c253"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.486610 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3dd33833-530e-4a48-9a21-c03a37d1c253" (UID: "3dd33833-530e-4a48-9a21-c03a37d1c253"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.493811 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dd33833-530e-4a48-9a21-c03a37d1c253" (UID: "3dd33833-530e-4a48-9a21-c03a37d1c253"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.502026 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7ff95bcc58-fzksr" podStartSLOduration=3.5020069879999998 podStartE2EDuration="3.502006988s" podCreationTimestamp="2026-01-30 21:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:18.500759303 +0000 UTC m=+1349.653905441" watchObservedRunningTime="2026-01-30 21:38:18.502006988 +0000 UTC m=+1349.655153126" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.522635 4834 scope.go:117] "RemoveContainer" containerID="93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.522876 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-config" (OuterVolumeSpecName: "config") pod "3dd33833-530e-4a48-9a21-c03a37d1c253" (UID: "3dd33833-530e-4a48-9a21-c03a37d1c253"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.531443 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dd33833-530e-4a48-9a21-c03a37d1c253" (UID: "3dd33833-530e-4a48-9a21-c03a37d1c253"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.532590 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.532610 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.532620 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.532628 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.532637 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd33833-530e-4a48-9a21-c03a37d1c253-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.541541 4834 scope.go:117] "RemoveContainer" containerID="f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57" Jan 30 21:38:18 crc kubenswrapper[4834]: E0130 21:38:18.541968 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57\": container with ID starting with f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57 not found: ID does not exist" 
containerID="f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.541996 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57"} err="failed to get container status \"f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57\": rpc error: code = NotFound desc = could not find container \"f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57\": container with ID starting with f12430c49823b8516a4bd183c73f85777ae6da532e93a6b5dec5b509345a6e57 not found: ID does not exist" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.542034 4834 scope.go:117] "RemoveContainer" containerID="93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c" Jan 30 21:38:18 crc kubenswrapper[4834]: E0130 21:38:18.542248 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c\": container with ID starting with 93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c not found: ID does not exist" containerID="93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.542267 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c"} err="failed to get container status \"93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c\": rpc error: code = NotFound desc = could not find container \"93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c\": container with ID starting with 93067bc2e9b35210263322f0b2f73414b05942a4a83ef6ce1edb8d4ade10d64c not found: ID does not exist" Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.798149 4834 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-4lk6b"] Jan 30 21:38:18 crc kubenswrapper[4834]: I0130 21:38:18.806558 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-4lk6b"] Jan 30 21:38:19 crc kubenswrapper[4834]: I0130 21:38:19.484231 4834 generic.go:334] "Generic (PLEG): container finished" podID="24972395-443a-40fc-a82c-e62e4f0e2192" containerID="c98617e6f9ebcea636edbe6c5ccb33beab7c9a970348b990280b8f7e1a05d437" exitCode=0 Jan 30 21:38:19 crc kubenswrapper[4834]: I0130 21:38:19.484314 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24972395-443a-40fc-a82c-e62e4f0e2192","Type":"ContainerDied","Data":"c98617e6f9ebcea636edbe6c5ccb33beab7c9a970348b990280b8f7e1a05d437"} Jan 30 21:38:19 crc kubenswrapper[4834]: I0130 21:38:19.575060 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd33833-530e-4a48-9a21-c03a37d1c253" path="/var/lib/kubelet/pods/3dd33833-530e-4a48-9a21-c03a37d1c253/volumes" Jan 30 21:38:19 crc kubenswrapper[4834]: I0130 21:38:19.575996 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5dffc79858-5v9vm" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.525349 4834 generic.go:334] "Generic (PLEG): container finished" podID="24972395-443a-40fc-a82c-e62e4f0e2192" containerID="d309fd4b63a2c82d319fbcbe70b6aa1db6c81af8c09994d42551c5f5434d0c5b" exitCode=0 Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.526789 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"24972395-443a-40fc-a82c-e62e4f0e2192","Type":"ContainerDied","Data":"d309fd4b63a2c82d319fbcbe70b6aa1db6c81af8c09994d42551c5f5434d0c5b"} Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.526943 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"24972395-443a-40fc-a82c-e62e4f0e2192","Type":"ContainerDied","Data":"dfe9c5d4e7feb9bd75c4b1680a38d56954a4274de583068a82817e71e754ccbc"} Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.527030 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfe9c5d4e7feb9bd75c4b1680a38d56954a4274de583068a82817e71e754ccbc" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.565578 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.660866 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 30 21:38:23 crc kubenswrapper[4834]: E0130 21:38:23.661277 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd33833-530e-4a48-9a21-c03a37d1c253" containerName="dnsmasq-dns" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.661305 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd33833-530e-4a48-9a21-c03a37d1c253" containerName="dnsmasq-dns" Jan 30 21:38:23 crc kubenswrapper[4834]: E0130 21:38:23.661479 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24972395-443a-40fc-a82c-e62e4f0e2192" containerName="cinder-scheduler" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.661491 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="24972395-443a-40fc-a82c-e62e4f0e2192" containerName="cinder-scheduler" Jan 30 21:38:23 crc kubenswrapper[4834]: E0130 21:38:23.661506 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd33833-530e-4a48-9a21-c03a37d1c253" containerName="init" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.661516 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd33833-530e-4a48-9a21-c03a37d1c253" containerName="init" Jan 30 21:38:23 crc kubenswrapper[4834]: E0130 21:38:23.661549 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24972395-443a-40fc-a82c-e62e4f0e2192" containerName="probe" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.661558 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="24972395-443a-40fc-a82c-e62e4f0e2192" containerName="probe" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.661811 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="24972395-443a-40fc-a82c-e62e4f0e2192" containerName="probe" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.661842 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="24972395-443a-40fc-a82c-e62e4f0e2192" containerName="cinder-scheduler" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.661857 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd33833-530e-4a48-9a21-c03a37d1c253" containerName="dnsmasq-dns" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.662671 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.665652 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.666309 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.666601 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gd8hm" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.675946 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.753487 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dttrk\" (UniqueName: \"kubernetes.io/projected/24972395-443a-40fc-a82c-e62e4f0e2192-kube-api-access-dttrk\") pod 
\"24972395-443a-40fc-a82c-e62e4f0e2192\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.753562 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24972395-443a-40fc-a82c-e62e4f0e2192-etc-machine-id\") pod \"24972395-443a-40fc-a82c-e62e4f0e2192\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.753621 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-scripts\") pod \"24972395-443a-40fc-a82c-e62e4f0e2192\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.753643 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-combined-ca-bundle\") pod \"24972395-443a-40fc-a82c-e62e4f0e2192\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.753699 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data\") pod \"24972395-443a-40fc-a82c-e62e4f0e2192\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.753716 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24972395-443a-40fc-a82c-e62e4f0e2192-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "24972395-443a-40fc-a82c-e62e4f0e2192" (UID: "24972395-443a-40fc-a82c-e62e4f0e2192"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.753828 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data-custom\") pod \"24972395-443a-40fc-a82c-e62e4f0e2192\" (UID: \"24972395-443a-40fc-a82c-e62e4f0e2192\") " Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.754219 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24972395-443a-40fc-a82c-e62e4f0e2192-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.762524 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-scripts" (OuterVolumeSpecName: "scripts") pod "24972395-443a-40fc-a82c-e62e4f0e2192" (UID: "24972395-443a-40fc-a82c-e62e4f0e2192"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.766519 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "24972395-443a-40fc-a82c-e62e4f0e2192" (UID: "24972395-443a-40fc-a82c-e62e4f0e2192"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.767602 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24972395-443a-40fc-a82c-e62e4f0e2192-kube-api-access-dttrk" (OuterVolumeSpecName: "kube-api-access-dttrk") pod "24972395-443a-40fc-a82c-e62e4f0e2192" (UID: "24972395-443a-40fc-a82c-e62e4f0e2192"). InnerVolumeSpecName "kube-api-access-dttrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.823581 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24972395-443a-40fc-a82c-e62e4f0e2192" (UID: "24972395-443a-40fc-a82c-e62e4f0e2192"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.855763 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bda10687-cb12-404d-a99f-366f499918ec-openstack-config-secret\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.855844 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda10687-cb12-404d-a99f-366f499918ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.855896 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bda10687-cb12-404d-a99f-366f499918ec-openstack-config\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.856177 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzhqm\" (UniqueName: \"kubernetes.io/projected/bda10687-cb12-404d-a99f-366f499918ec-kube-api-access-hzhqm\") pod \"openstackclient\" (UID: 
\"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.856260 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.856275 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.856286 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.856295 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dttrk\" (UniqueName: \"kubernetes.io/projected/24972395-443a-40fc-a82c-e62e4f0e2192-kube-api-access-dttrk\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.904497 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data" (OuterVolumeSpecName: "config-data") pod "24972395-443a-40fc-a82c-e62e4f0e2192" (UID: "24972395-443a-40fc-a82c-e62e4f0e2192"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.958352 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda10687-cb12-404d-a99f-366f499918ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.958444 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bda10687-cb12-404d-a99f-366f499918ec-openstack-config\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.958484 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzhqm\" (UniqueName: \"kubernetes.io/projected/bda10687-cb12-404d-a99f-366f499918ec-kube-api-access-hzhqm\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.958553 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bda10687-cb12-404d-a99f-366f499918ec-openstack-config-secret\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.958622 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24972395-443a-40fc-a82c-e62e4f0e2192-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.959436 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/bda10687-cb12-404d-a99f-366f499918ec-openstack-config\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.962825 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda10687-cb12-404d-a99f-366f499918ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.962863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bda10687-cb12-404d-a99f-366f499918ec-openstack-config-secret\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.979137 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzhqm\" (UniqueName: \"kubernetes.io/projected/bda10687-cb12-404d-a99f-366f499918ec-kube-api-access-hzhqm\") pod \"openstackclient\" (UID: \"bda10687-cb12-404d-a99f-366f499918ec\") " pod="openstack/openstackclient" Jan 30 21:38:23 crc kubenswrapper[4834]: I0130 21:38:23.995180 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.535514 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.580860 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.592510 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.608113 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.610200 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.615997 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.616649 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.624308 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.776262 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-scripts\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.776349 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.776566 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-config-data\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.776658 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsft5\" (UniqueName: \"kubernetes.io/projected/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-kube-api-access-qsft5\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.776722 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.776802 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.878751 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-config-data\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.878862 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qsft5\" (UniqueName: \"kubernetes.io/projected/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-kube-api-access-qsft5\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.878889 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.878959 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.879035 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-scripts\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.879069 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.879163 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.889283 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.902971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-scripts\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.903260 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.903541 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-config-data\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.908941 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsft5\" (UniqueName: \"kubernetes.io/projected/8471f1a0-28a4-4c47-a5a0-77e84eb88e30-kube-api-access-qsft5\") pod \"cinder-scheduler-0\" (UID: \"8471f1a0-28a4-4c47-a5a0-77e84eb88e30\") " pod="openstack/cinder-scheduler-0" Jan 30 21:38:24 crc kubenswrapper[4834]: I0130 21:38:24.984066 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 30 21:38:25 crc kubenswrapper[4834]: I0130 21:38:25.251772 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 30 21:38:25 crc kubenswrapper[4834]: I0130 21:38:25.519129 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 30 21:38:25 crc kubenswrapper[4834]: W0130 21:38:25.532910 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8471f1a0_28a4_4c47_a5a0_77e84eb88e30.slice/crio-5390d45825f4060eab567cc998a939cf29f5bd1d85ac076e091a1bbaf9a81eac WatchSource:0}: Error finding container 5390d45825f4060eab567cc998a939cf29f5bd1d85ac076e091a1bbaf9a81eac: Status 404 returned error can't find the container with id 5390d45825f4060eab567cc998a939cf29f5bd1d85ac076e091a1bbaf9a81eac Jan 30 21:38:25 crc kubenswrapper[4834]: I0130 21:38:25.546363 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24972395-443a-40fc-a82c-e62e4f0e2192" path="/var/lib/kubelet/pods/24972395-443a-40fc-a82c-e62e4f0e2192/volumes" Jan 30 21:38:25 crc kubenswrapper[4834]: I0130 21:38:25.561533 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8471f1a0-28a4-4c47-a5a0-77e84eb88e30","Type":"ContainerStarted","Data":"5390d45825f4060eab567cc998a939cf29f5bd1d85ac076e091a1bbaf9a81eac"} Jan 30 21:38:25 crc kubenswrapper[4834]: I0130 21:38:25.565566 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bda10687-cb12-404d-a99f-366f499918ec","Type":"ContainerStarted","Data":"dccad40395eb05e717a59bb0a6fedb3e22f170f4558603aee75ed8348bf0f04a"} Jan 30 21:38:26 crc kubenswrapper[4834]: I0130 21:38:26.592740 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"8471f1a0-28a4-4c47-a5a0-77e84eb88e30","Type":"ContainerStarted","Data":"b234c5efc8c4ab395feb3e0b03bfd3b0d22a59f290a7fdeab9f4a93ce62b85fc"} Jan 30 21:38:27 crc kubenswrapper[4834]: I0130 21:38:27.614909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8471f1a0-28a4-4c47-a5a0-77e84eb88e30","Type":"ContainerStarted","Data":"249fd746f56e84e2600bf7d3a6f332f2232cc5946e2ed2c9ea7fa8bbc3714e13"} Jan 30 21:38:27 crc kubenswrapper[4834]: I0130 21:38:27.654366 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.654340732 podStartE2EDuration="3.654340732s" podCreationTimestamp="2026-01-30 21:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:27.639167616 +0000 UTC m=+1358.792313764" watchObservedRunningTime="2026-01-30 21:38:27.654340732 +0000 UTC m=+1358.807486870" Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.065891 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.114552 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7ff95bcc58-fzksr" Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.192565 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76bf79b888-r57rl"] Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.192830 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76bf79b888-r57rl" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api-log" containerID="cri-o://4b9d2e7f674a6626c69d9e1030da0e2ce2bd7d96dfd7b3a48606e98f9edd0035" gracePeriod=30 Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.193417 4834 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76bf79b888-r57rl" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api" containerID="cri-o://2355a06540d2a97f4eefc4159ab8134ced96973c1e9439e57cd57e3314fbfd85" gracePeriod=30 Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.628423 4834 generic.go:334] "Generic (PLEG): container finished" podID="e42b9899-e1fe-422f-af90-b08348a78694" containerID="4b9d2e7f674a6626c69d9e1030da0e2ce2bd7d96dfd7b3a48606e98f9edd0035" exitCode=143 Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.628495 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76bf79b888-r57rl" event={"ID":"e42b9899-e1fe-422f-af90-b08348a78694","Type":"ContainerDied","Data":"4b9d2e7f674a6626c69d9e1030da0e2ce2bd7d96dfd7b3a48606e98f9edd0035"} Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.935696 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5bb7f7bbcf-bjdrn"] Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.937094 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.947138 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.947328 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.948942 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 30 21:38:28 crc kubenswrapper[4834]: I0130 21:38:28.962978 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5bb7f7bbcf-bjdrn"] Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.073715 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-config-data\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.073915 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-combined-ca-bundle\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.074140 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a480b4-e084-4ea8-b438-6a3c217b4514-log-httpd\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 
21:38:29.074215 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a480b4-e084-4ea8-b438-6a3c217b4514-run-httpd\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.074295 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqksx\" (UniqueName: \"kubernetes.io/projected/c8a480b4-e084-4ea8-b438-6a3c217b4514-kube-api-access-fqksx\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.074334 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-public-tls-certs\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.074382 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-internal-tls-certs\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.074444 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8a480b4-e084-4ea8-b438-6a3c217b4514-etc-swift\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 
21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.176670 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a480b4-e084-4ea8-b438-6a3c217b4514-log-httpd\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.176725 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a480b4-e084-4ea8-b438-6a3c217b4514-run-httpd\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.176763 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqksx\" (UniqueName: \"kubernetes.io/projected/c8a480b4-e084-4ea8-b438-6a3c217b4514-kube-api-access-fqksx\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.176801 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-public-tls-certs\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.176828 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-internal-tls-certs\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 
21:38:29.176855 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8a480b4-e084-4ea8-b438-6a3c217b4514-etc-swift\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.176910 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-config-data\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.176958 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-combined-ca-bundle\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.177619 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a480b4-e084-4ea8-b438-6a3c217b4514-log-httpd\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.178217 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8a480b4-e084-4ea8-b438-6a3c217b4514-run-httpd\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.191915 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-combined-ca-bundle\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.193264 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-internal-tls-certs\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.193980 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-public-tls-certs\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.201149 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8a480b4-e084-4ea8-b438-6a3c217b4514-config-data\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.202670 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqksx\" (UniqueName: \"kubernetes.io/projected/c8a480b4-e084-4ea8-b438-6a3c217b4514-kube-api-access-fqksx\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.214667 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/c8a480b4-e084-4ea8-b438-6a3c217b4514-etc-swift\") pod \"swift-proxy-5bb7f7bbcf-bjdrn\" (UID: \"c8a480b4-e084-4ea8-b438-6a3c217b4514\") " pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.253854 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.362746 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65cd556484-hkshq" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.362986 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65cd556484-hkshq" Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.884187 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5bb7f7bbcf-bjdrn"] Jan 30 21:38:29 crc kubenswrapper[4834]: W0130 21:38:29.900140 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8a480b4_e084_4ea8_b438_6a3c217b4514.slice/crio-b7b3042de30f35da398c37a04a680a956a5a80ad5bcf5880aa94577d221d1672 WatchSource:0}: Error finding container b7b3042de30f35da398c37a04a680a956a5a80ad5bcf5880aa94577d221d1672: Status 404 returned error can't find the container with id b7b3042de30f35da398c37a04a680a956a5a80ad5bcf5880aa94577d221d1672 Jan 30 21:38:29 crc kubenswrapper[4834]: I0130 21:38:29.984484 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.048213 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.048544 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" 
containerName="ceilometer-central-agent" containerID="cri-o://b5253a399f45f16b3687052f9a2ec21dfa1af7be25a435f31860dcc04d837d78" gracePeriod=30 Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.048890 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="sg-core" containerID="cri-o://1e37564284014c67f25364cd245b64f6f167b667a5d1d6abb7b789f79149144d" gracePeriod=30 Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.048975 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="proxy-httpd" containerID="cri-o://cdad460eac0011adb94718dc096811ea2034d25ba0ef1617f6648ab95ea4bad9" gracePeriod=30 Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.049040 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="ceilometer-notification-agent" containerID="cri-o://b3d780c50a321a5a6f2fcd20f56ec4c15b04a7bb0d4028c54a47657da158a48d" gracePeriod=30 Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.069753 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.662940 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" event={"ID":"c8a480b4-e084-4ea8-b438-6a3c217b4514","Type":"ContainerStarted","Data":"5657af107956e7e56a0b03f1fb23a865163937e7dd511182b839c761053a281e"} Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.663193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" event={"ID":"c8a480b4-e084-4ea8-b438-6a3c217b4514","Type":"ContainerStarted","Data":"f74333b519526450b08be51f8df4b681dc2b9b2915f395226599774558f91a4a"} Jan 30 21:38:30 
crc kubenswrapper[4834]: I0130 21:38:30.663203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" event={"ID":"c8a480b4-e084-4ea8-b438-6a3c217b4514","Type":"ContainerStarted","Data":"b7b3042de30f35da398c37a04a680a956a5a80ad5bcf5880aa94577d221d1672"} Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.663416 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.663478 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.672501 4834 generic.go:334] "Generic (PLEG): container finished" podID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerID="cdad460eac0011adb94718dc096811ea2034d25ba0ef1617f6648ab95ea4bad9" exitCode=0 Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.672540 4834 generic.go:334] "Generic (PLEG): container finished" podID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerID="1e37564284014c67f25364cd245b64f6f167b667a5d1d6abb7b789f79149144d" exitCode=2 Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.672575 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerDied","Data":"cdad460eac0011adb94718dc096811ea2034d25ba0ef1617f6648ab95ea4bad9"} Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.672631 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerDied","Data":"1e37564284014c67f25364cd245b64f6f167b667a5d1d6abb7b789f79149144d"} Jan 30 21:38:30 crc kubenswrapper[4834]: I0130 21:38:30.690298 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" podStartSLOduration=2.690281496 
podStartE2EDuration="2.690281496s" podCreationTimestamp="2026-01-30 21:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:30.683444923 +0000 UTC m=+1361.836591061" watchObservedRunningTime="2026-01-30 21:38:30.690281496 +0000 UTC m=+1361.843427634" Jan 30 21:38:31 crc kubenswrapper[4834]: I0130 21:38:31.411103 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76bf79b888-r57rl" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": read tcp 10.217.0.2:36874->10.217.0.172:9311: read: connection reset by peer" Jan 30 21:38:31 crc kubenswrapper[4834]: I0130 21:38:31.411175 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76bf79b888-r57rl" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.172:9311/healthcheck\": read tcp 10.217.0.2:36882->10.217.0.172:9311: read: connection reset by peer" Jan 30 21:38:31 crc kubenswrapper[4834]: I0130 21:38:31.684732 4834 generic.go:334] "Generic (PLEG): container finished" podID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerID="b5253a399f45f16b3687052f9a2ec21dfa1af7be25a435f31860dcc04d837d78" exitCode=0 Jan 30 21:38:31 crc kubenswrapper[4834]: I0130 21:38:31.685648 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerDied","Data":"b5253a399f45f16b3687052f9a2ec21dfa1af7be25a435f31860dcc04d837d78"} Jan 30 21:38:31 crc kubenswrapper[4834]: I0130 21:38:31.691089 4834 generic.go:334] "Generic (PLEG): container finished" podID="e42b9899-e1fe-422f-af90-b08348a78694" containerID="2355a06540d2a97f4eefc4159ab8134ced96973c1e9439e57cd57e3314fbfd85" exitCode=0 Jan 30 21:38:31 crc kubenswrapper[4834]: 
I0130 21:38:31.691433 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76bf79b888-r57rl" event={"ID":"e42b9899-e1fe-422f-af90-b08348a78694","Type":"ContainerDied","Data":"2355a06540d2a97f4eefc4159ab8134ced96973c1e9439e57cd57e3314fbfd85"} Jan 30 21:38:31 crc kubenswrapper[4834]: I0130 21:38:31.902329 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.050286 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kssdg\" (UniqueName: \"kubernetes.io/projected/e42b9899-e1fe-422f-af90-b08348a78694-kube-api-access-kssdg\") pod \"e42b9899-e1fe-422f-af90-b08348a78694\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.050427 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42b9899-e1fe-422f-af90-b08348a78694-logs\") pod \"e42b9899-e1fe-422f-af90-b08348a78694\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.050456 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data\") pod \"e42b9899-e1fe-422f-af90-b08348a78694\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.050562 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data-custom\") pod \"e42b9899-e1fe-422f-af90-b08348a78694\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.050635 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-combined-ca-bundle\") pod \"e42b9899-e1fe-422f-af90-b08348a78694\" (UID: \"e42b9899-e1fe-422f-af90-b08348a78694\") " Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.050832 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42b9899-e1fe-422f-af90-b08348a78694-logs" (OuterVolumeSpecName: "logs") pod "e42b9899-e1fe-422f-af90-b08348a78694" (UID: "e42b9899-e1fe-422f-af90-b08348a78694"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.051092 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42b9899-e1fe-422f-af90-b08348a78694-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.057599 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42b9899-e1fe-422f-af90-b08348a78694-kube-api-access-kssdg" (OuterVolumeSpecName: "kube-api-access-kssdg") pod "e42b9899-e1fe-422f-af90-b08348a78694" (UID: "e42b9899-e1fe-422f-af90-b08348a78694"). InnerVolumeSpecName "kube-api-access-kssdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.069562 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e42b9899-e1fe-422f-af90-b08348a78694" (UID: "e42b9899-e1fe-422f-af90-b08348a78694"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.121436 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e42b9899-e1fe-422f-af90-b08348a78694" (UID: "e42b9899-e1fe-422f-af90-b08348a78694"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.125419 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data" (OuterVolumeSpecName: "config-data") pod "e42b9899-e1fe-422f-af90-b08348a78694" (UID: "e42b9899-e1fe-422f-af90-b08348a78694"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.153162 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kssdg\" (UniqueName: \"kubernetes.io/projected/e42b9899-e1fe-422f-af90-b08348a78694-kube-api-access-kssdg\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.153576 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.153677 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.153795 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42b9899-e1fe-422f-af90-b08348a78694-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.327820 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.168:3000/\": dial tcp 10.217.0.168:3000: connect: connection refused" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.732608 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76bf79b888-r57rl" event={"ID":"e42b9899-e1fe-422f-af90-b08348a78694","Type":"ContainerDied","Data":"7c88b808af391a25a66e501c1f354a167ff7ffdbadfe3e3352968d40aeabf633"} Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.732676 4834 scope.go:117] "RemoveContainer" containerID="2355a06540d2a97f4eefc4159ab8134ced96973c1e9439e57cd57e3314fbfd85" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.732895 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76bf79b888-r57rl" Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.771457 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76bf79b888-r57rl"] Jan 30 21:38:32 crc kubenswrapper[4834]: I0130 21:38:32.779273 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-76bf79b888-r57rl"] Jan 30 21:38:33 crc kubenswrapper[4834]: I0130 21:38:33.545709 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42b9899-e1fe-422f-af90-b08348a78694" path="/var/lib/kubelet/pods/e42b9899-e1fe-422f-af90-b08348a78694/volumes" Jan 30 21:38:33 crc kubenswrapper[4834]: I0130 21:38:33.744033 4834 generic.go:334] "Generic (PLEG): container finished" podID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerID="b3d780c50a321a5a6f2fcd20f56ec4c15b04a7bb0d4028c54a47657da158a48d" exitCode=0 Jan 30 21:38:33 crc kubenswrapper[4834]: I0130 21:38:33.744086 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerDied","Data":"b3d780c50a321a5a6f2fcd20f56ec4c15b04a7bb0d4028c54a47657da158a48d"} Jan 30 21:38:34 crc kubenswrapper[4834]: I0130 21:38:34.161030 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:38:34 crc kubenswrapper[4834]: I0130 21:38:34.161099 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:38:34 crc kubenswrapper[4834]: I0130 21:38:34.161150 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:38:34 crc kubenswrapper[4834]: I0130 21:38:34.162096 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8384069132f18eea6ac87d501b64935494bfc35764a079472903e1922c44d982"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:38:34 crc kubenswrapper[4834]: I0130 21:38:34.162168 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://8384069132f18eea6ac87d501b64935494bfc35764a079472903e1922c44d982" gracePeriod=600 Jan 30 21:38:34 crc kubenswrapper[4834]: 
I0130 21:38:34.755518 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="8384069132f18eea6ac87d501b64935494bfc35764a079472903e1922c44d982" exitCode=0 Jan 30 21:38:34 crc kubenswrapper[4834]: I0130 21:38:34.755559 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"8384069132f18eea6ac87d501b64935494bfc35764a079472903e1922c44d982"} Jan 30 21:38:35 crc kubenswrapper[4834]: I0130 21:38:35.217963 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 30 21:38:36 crc kubenswrapper[4834]: I0130 21:38:36.820339 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:38:36 crc kubenswrapper[4834]: I0130 21:38:36.820808 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-log" containerID="cri-o://39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122" gracePeriod=30 Jan 30 21:38:36 crc kubenswrapper[4834]: I0130 21:38:36.820943 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-httpd" containerID="cri-o://206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789" gracePeriod=30 Jan 30 21:38:36 crc kubenswrapper[4834]: I0130 21:38:36.966614 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:37 crc kubenswrapper[4834]: I0130 21:38:37.667536 4834 scope.go:117] "RemoveContainer" containerID="4b9d2e7f674a6626c69d9e1030da0e2ce2bd7d96dfd7b3a48606e98f9edd0035" Jan 30 21:38:37 crc 
kubenswrapper[4834]: I0130 21:38:37.731559 4834 scope.go:117] "RemoveContainer" containerID="75ffc2f37f0663828c033fce2d59c1e7b940cbd240e347042d2341fa7fcc4ac6" Jan 30 21:38:37 crc kubenswrapper[4834]: I0130 21:38:37.809964 4834 generic.go:334] "Generic (PLEG): container finished" podID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerID="39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122" exitCode=143 Jan 30 21:38:37 crc kubenswrapper[4834]: I0130 21:38:37.810285 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e39cea01-f258-49de-a89e-380cc2ccdbb1","Type":"ContainerDied","Data":"39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122"} Jan 30 21:38:37 crc kubenswrapper[4834]: I0130 21:38:37.835331 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:38:37 crc kubenswrapper[4834]: I0130 21:38:37.835566 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-log" containerID="cri-o://43decc1b912af4a1c561492b82cda3a337dd6d4896c7395e9fa11b341ae505d1" gracePeriod=30 Jan 30 21:38:37 crc kubenswrapper[4834]: I0130 21:38:37.836199 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-httpd" containerID="cri-o://379f553bea4d38792a3b231c9d0af167a7bb371728d3a16701c761a78703280f" gracePeriod=30 Jan 30 21:38:37 crc kubenswrapper[4834]: I0130 21:38:37.961759 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.119447 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2hcn\" (UniqueName: \"kubernetes.io/projected/d5b5c426-8ce0-4431-8f98-a18bc07163ff-kube-api-access-w2hcn\") pod \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.120732 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-config-data\") pod \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.120766 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-log-httpd\") pod \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.121217 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-combined-ca-bundle\") pod \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.121275 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-run-httpd\") pod \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.121388 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-scripts\") pod \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.121440 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-sg-core-conf-yaml\") pod \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\" (UID: \"d5b5c426-8ce0-4431-8f98-a18bc07163ff\") " Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.121484 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d5b5c426-8ce0-4431-8f98-a18bc07163ff" (UID: "d5b5c426-8ce0-4431-8f98-a18bc07163ff"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.121918 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.122127 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d5b5c426-8ce0-4431-8f98-a18bc07163ff" (UID: "d5b5c426-8ce0-4431-8f98-a18bc07163ff"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.125961 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b5c426-8ce0-4431-8f98-a18bc07163ff-kube-api-access-w2hcn" (OuterVolumeSpecName: "kube-api-access-w2hcn") pod "d5b5c426-8ce0-4431-8f98-a18bc07163ff" (UID: "d5b5c426-8ce0-4431-8f98-a18bc07163ff"). 
InnerVolumeSpecName "kube-api-access-w2hcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.128167 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-scripts" (OuterVolumeSpecName: "scripts") pod "d5b5c426-8ce0-4431-8f98-a18bc07163ff" (UID: "d5b5c426-8ce0-4431-8f98-a18bc07163ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.158709 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d5b5c426-8ce0-4431-8f98-a18bc07163ff" (UID: "d5b5c426-8ce0-4431-8f98-a18bc07163ff"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.205329 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5b5c426-8ce0-4431-8f98-a18bc07163ff" (UID: "d5b5c426-8ce0-4431-8f98-a18bc07163ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.209590 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-config-data" (OuterVolumeSpecName: "config-data") pod "d5b5c426-8ce0-4431-8f98-a18bc07163ff" (UID: "d5b5c426-8ce0-4431-8f98-a18bc07163ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.223167 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5b5c426-8ce0-4431-8f98-a18bc07163ff-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.223210 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.223227 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.223239 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.223250 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2hcn\" (UniqueName: \"kubernetes.io/projected/d5b5c426-8ce0-4431-8f98-a18bc07163ff-kube-api-access-w2hcn\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.223262 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b5c426-8ce0-4431-8f98-a18bc07163ff-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.829737 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"bda10687-cb12-404d-a99f-366f499918ec","Type":"ContainerStarted","Data":"8ba6439b106179c2a143a35b8933a7306c31c2f1b40e1654d60b240929122c55"} Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 
21:38:38.835862 4834 generic.go:334] "Generic (PLEG): container finished" podID="4075b406-33bb-40e3-9429-087ba19fcb32" containerID="43decc1b912af4a1c561492b82cda3a337dd6d4896c7395e9fa11b341ae505d1" exitCode=143 Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.835976 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4075b406-33bb-40e3-9429-087ba19fcb32","Type":"ContainerDied","Data":"43decc1b912af4a1c561492b82cda3a337dd6d4896c7395e9fa11b341ae505d1"} Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.838473 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a"} Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.844067 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5b5c426-8ce0-4431-8f98-a18bc07163ff","Type":"ContainerDied","Data":"625489fb6106ae3063054779db61642f62845e171c658160c9ace9b6a61731e2"} Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.844124 4834 scope.go:117] "RemoveContainer" containerID="cdad460eac0011adb94718dc096811ea2034d25ba0ef1617f6648ab95ea4bad9" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.844253 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.857382 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.756642933 podStartE2EDuration="15.857363594s" podCreationTimestamp="2026-01-30 21:38:23 +0000 UTC" firstStartedPulling="2026-01-30 21:38:24.611797303 +0000 UTC m=+1355.764943451" lastFinishedPulling="2026-01-30 21:38:37.712517974 +0000 UTC m=+1368.865664112" observedRunningTime="2026-01-30 21:38:38.855585273 +0000 UTC m=+1370.008731461" watchObservedRunningTime="2026-01-30 21:38:38.857363594 +0000 UTC m=+1370.010509752" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.875085 4834 scope.go:117] "RemoveContainer" containerID="1e37564284014c67f25364cd245b64f6f167b667a5d1d6abb7b789f79149144d" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.900144 4834 scope.go:117] "RemoveContainer" containerID="b3d780c50a321a5a6f2fcd20f56ec4c15b04a7bb0d4028c54a47657da158a48d" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.911448 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.917512 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.936597 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.930422 4834 scope.go:117] "RemoveContainer" containerID="b5253a399f45f16b3687052f9a2ec21dfa1af7be25a435f31860dcc04d837d78" Jan 30 21:38:38 crc kubenswrapper[4834]: E0130 21:38:38.937177 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="sg-core" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.937250 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="sg-core" Jan 30 21:38:38 crc kubenswrapper[4834]: E0130 21:38:38.937325 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="ceilometer-central-agent" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.937376 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="ceilometer-central-agent" Jan 30 21:38:38 crc kubenswrapper[4834]: E0130 21:38:38.937464 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.937517 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api" Jan 30 21:38:38 crc kubenswrapper[4834]: E0130 21:38:38.937576 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="proxy-httpd" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.937625 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="proxy-httpd" Jan 30 21:38:38 crc kubenswrapper[4834]: E0130 21:38:38.937682 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="ceilometer-notification-agent" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.937731 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="ceilometer-notification-agent" Jan 30 21:38:38 crc kubenswrapper[4834]: E0130 21:38:38.937793 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api-log" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.937847 4834 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api-log" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.938103 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.938166 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42b9899-e1fe-422f-af90-b08348a78694" containerName="barbican-api-log" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.938271 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="sg-core" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.938335 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="ceilometer-notification-agent" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.938490 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="ceilometer-central-agent" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.938546 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" containerName="proxy-httpd" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.940194 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.941964 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.942340 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:38:38 crc kubenswrapper[4834]: I0130 21:38:38.948650 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.043974 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75cz5\" (UniqueName: \"kubernetes.io/projected/bd2c492a-77e9-46c6-8a7d-943e43cc7776-kube-api-access-75cz5\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.044076 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-log-httpd\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.044118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-config-data\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.044158 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-run-httpd\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " 
pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.044183 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.044225 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-scripts\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.044249 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.146280 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-log-httpd\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.146345 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-config-data\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.146379 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-run-httpd\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.146416 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.146453 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-scripts\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.146474 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.146604 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75cz5\" (UniqueName: \"kubernetes.io/projected/bd2c492a-77e9-46c6-8a7d-943e43cc7776-kube-api-access-75cz5\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.147470 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-log-httpd\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc 
kubenswrapper[4834]: I0130 21:38:39.148547 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-run-httpd\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.153035 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-scripts\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.153103 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.154474 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-config-data\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.164238 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.165307 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75cz5\" (UniqueName: \"kubernetes.io/projected/bd2c492a-77e9-46c6-8a7d-943e43cc7776-kube-api-access-75cz5\") pod \"ceilometer-0\" (UID: 
\"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") " pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.260670 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.262731 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5bb7f7bbcf-bjdrn" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.269196 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.604331 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b5c426-8ce0-4431-8f98-a18bc07163ff" path="/var/lib/kubelet/pods/d5b5c426-8ce0-4431-8f98-a18bc07163ff/volumes" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.605542 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.622513 4834 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod8dfc4a01-3fc1-4360-bd1a-3643d5da2b05"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod8dfc4a01-3fc1-4360-bd1a-3643d5da2b05] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8dfc4a01_3fc1_4360_bd1a_3643d5da2b05.slice" Jan 30 21:38:39 crc kubenswrapper[4834]: E0130 21:38:39.622589 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod8dfc4a01-3fc1-4360-bd1a-3643d5da2b05] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod8dfc4a01-3fc1-4360-bd1a-3643d5da2b05] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8dfc4a01_3fc1_4360_bd1a_3643d5da2b05.slice" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" podUID="8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" Jan 30 21:38:39 crc 
kubenswrapper[4834]: I0130 21:38:39.781218 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.857662 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerStarted","Data":"d8b8b79eb618ef6c8a1cdc54f55b1623e7f0df29e5c32940be2abaa90c7f1552"} Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.859010 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d66f584d7-qsvxj" Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.922689 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-qsvxj"] Jan 30 21:38:39 crc kubenswrapper[4834]: I0130 21:38:39.955118 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d66f584d7-qsvxj"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.122496 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.164:9292/healthcheck\": read tcp 10.217.0.2:58504->10.217.0.164:9292: read: connection reset by peer" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.122577 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.164:9292/healthcheck\": read tcp 10.217.0.2:58512->10.217.0.164:9292: read: connection reset by peer" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.485449 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-v4rg6"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.486976 4834 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.506568 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-v4rg6"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.584937 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgqp5\" (UniqueName: \"kubernetes.io/projected/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-kube-api-access-jgqp5\") pod \"nova-api-db-create-v4rg6\" (UID: \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\") " pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.585022 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-operator-scripts\") pod \"nova-api-db-create-v4rg6\" (UID: \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\") " pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.687736 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgqp5\" (UniqueName: \"kubernetes.io/projected/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-kube-api-access-jgqp5\") pod \"nova-api-db-create-v4rg6\" (UID: \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\") " pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.687858 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-operator-scripts\") pod \"nova-api-db-create-v4rg6\" (UID: \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\") " pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.688813 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-operator-scripts\") pod \"nova-api-db-create-v4rg6\" (UID: \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\") " pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.691139 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-98df-account-create-update-cdwj4"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.692651 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.694763 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.712674 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgqp5\" (UniqueName: \"kubernetes.io/projected/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-kube-api-access-jgqp5\") pod \"nova-api-db-create-v4rg6\" (UID: \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\") " pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.716077 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.716464 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-98df-account-create-update-cdwj4"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.778879 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-2dkk4"] Jan 30 21:38:40 crc kubenswrapper[4834]: E0130 21:38:40.779278 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-log" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.779297 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-log" Jan 30 21:38:40 crc kubenswrapper[4834]: E0130 21:38:40.779317 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-httpd" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.779324 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-httpd" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.779587 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-log" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.779679 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerName="glance-httpd" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.780289 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.802809 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2dkk4"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.850573 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.880579 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerStarted","Data":"c587ccd377f00643c81d87f7f684f83317b45bc37a318bfab347463f6869ad5f"} Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.892730 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"e39cea01-f258-49de-a89e-380cc2ccdbb1\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.892783 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-logs\") pod \"e39cea01-f258-49de-a89e-380cc2ccdbb1\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.892841 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-httpd-run\") pod \"e39cea01-f258-49de-a89e-380cc2ccdbb1\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.892872 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtjcf\" (UniqueName: \"kubernetes.io/projected/e39cea01-f258-49de-a89e-380cc2ccdbb1-kube-api-access-rtjcf\") pod 
\"e39cea01-f258-49de-a89e-380cc2ccdbb1\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.892908 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-combined-ca-bundle\") pod \"e39cea01-f258-49de-a89e-380cc2ccdbb1\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.892951 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-config-data\") pod \"e39cea01-f258-49de-a89e-380cc2ccdbb1\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.892992 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-scripts\") pod \"e39cea01-f258-49de-a89e-380cc2ccdbb1\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.893022 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-public-tls-certs\") pod \"e39cea01-f258-49de-a89e-380cc2ccdbb1\" (UID: \"e39cea01-f258-49de-a89e-380cc2ccdbb1\") " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.893163 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wndv\" (UniqueName: \"kubernetes.io/projected/610bc5a1-2033-4f01-8ee9-e02c596fc94f-kube-api-access-2wndv\") pod \"nova-api-98df-account-create-update-cdwj4\" (UID: \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\") " pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.893253 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc6106-f66d-4945-8ec9-91de63f2d579-operator-scripts\") pod \"nova-cell0-db-create-2dkk4\" (UID: \"51bc6106-f66d-4945-8ec9-91de63f2d579\") " pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.893277 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610bc5a1-2033-4f01-8ee9-e02c596fc94f-operator-scripts\") pod \"nova-api-98df-account-create-update-cdwj4\" (UID: \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\") " pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.893294 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r599\" (UniqueName: \"kubernetes.io/projected/51bc6106-f66d-4945-8ec9-91de63f2d579-kube-api-access-4r599\") pod \"nova-cell0-db-create-2dkk4\" (UID: \"51bc6106-f66d-4945-8ec9-91de63f2d579\") " pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.893917 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-krbbm"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.894364 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e39cea01-f258-49de-a89e-380cc2ccdbb1" (UID: "e39cea01-f258-49de-a89e-380cc2ccdbb1"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.894635 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-logs" (OuterVolumeSpecName: "logs") pod "e39cea01-f258-49de-a89e-380cc2ccdbb1" (UID: "e39cea01-f258-49de-a89e-380cc2ccdbb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.896583 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.903497 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-scripts" (OuterVolumeSpecName: "scripts") pod "e39cea01-f258-49de-a89e-380cc2ccdbb1" (UID: "e39cea01-f258-49de-a89e-380cc2ccdbb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.903852 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "e39cea01-f258-49de-a89e-380cc2ccdbb1" (UID: "e39cea01-f258-49de-a89e-380cc2ccdbb1"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.905357 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-krbbm"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.906438 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39cea01-f258-49de-a89e-380cc2ccdbb1-kube-api-access-rtjcf" (OuterVolumeSpecName: "kube-api-access-rtjcf") pod "e39cea01-f258-49de-a89e-380cc2ccdbb1" (UID: "e39cea01-f258-49de-a89e-380cc2ccdbb1"). InnerVolumeSpecName "kube-api-access-rtjcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.917493 4834 generic.go:334] "Generic (PLEG): container finished" podID="e39cea01-f258-49de-a89e-380cc2ccdbb1" containerID="206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789" exitCode=0 Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.917537 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e39cea01-f258-49de-a89e-380cc2ccdbb1","Type":"ContainerDied","Data":"206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789"} Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.917564 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e39cea01-f258-49de-a89e-380cc2ccdbb1","Type":"ContainerDied","Data":"e023551f0f37caf7dd8dfce3bd6f3e40a5cef1c015da9916a56a5a26849cf884"} Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.917580 4834 scope.go:117] "RemoveContainer" containerID="206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.917913 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.960683 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e39cea01-f258-49de-a89e-380cc2ccdbb1" (UID: "e39cea01-f258-49de-a89e-380cc2ccdbb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.962708 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-35d8-account-create-update-7rznt"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.963935 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.967466 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.989562 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-35d8-account-create-update-7rznt"] Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998379 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf239124-75b5-4aa0-ade1-af07156f6b14-operator-scripts\") pod \"nova-cell0-35d8-account-create-update-7rznt\" (UID: \"bf239124-75b5-4aa0-ade1-af07156f6b14\") " pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998447 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8h7r\" (UniqueName: \"kubernetes.io/projected/8547e76b-acd4-45ea-a2ba-308e26be62b5-kube-api-access-b8h7r\") pod 
\"nova-cell1-db-create-krbbm\" (UID: \"8547e76b-acd4-45ea-a2ba-308e26be62b5\") " pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998532 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8547e76b-acd4-45ea-a2ba-308e26be62b5-operator-scripts\") pod \"nova-cell1-db-create-krbbm\" (UID: \"8547e76b-acd4-45ea-a2ba-308e26be62b5\") " pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998592 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wndv\" (UniqueName: \"kubernetes.io/projected/610bc5a1-2033-4f01-8ee9-e02c596fc94f-kube-api-access-2wndv\") pod \"nova-api-98df-account-create-update-cdwj4\" (UID: \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\") " pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998697 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvtxm\" (UniqueName: \"kubernetes.io/projected/bf239124-75b5-4aa0-ade1-af07156f6b14-kube-api-access-xvtxm\") pod \"nova-cell0-35d8-account-create-update-7rznt\" (UID: \"bf239124-75b5-4aa0-ade1-af07156f6b14\") " pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998789 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc6106-f66d-4945-8ec9-91de63f2d579-operator-scripts\") pod \"nova-cell0-db-create-2dkk4\" (UID: \"51bc6106-f66d-4945-8ec9-91de63f2d579\") " pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998841 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/610bc5a1-2033-4f01-8ee9-e02c596fc94f-operator-scripts\") pod \"nova-api-98df-account-create-update-cdwj4\" (UID: \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\") " pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998863 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r599\" (UniqueName: \"kubernetes.io/projected/51bc6106-f66d-4945-8ec9-91de63f2d579-kube-api-access-4r599\") pod \"nova-cell0-db-create-2dkk4\" (UID: \"51bc6106-f66d-4945-8ec9-91de63f2d579\") " pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998977 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.998992 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.999001 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e39cea01-f258-49de-a89e-380cc2ccdbb1-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.999011 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtjcf\" (UniqueName: \"kubernetes.io/projected/e39cea01-f258-49de-a89e-380cc2ccdbb1-kube-api-access-rtjcf\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:40 crc kubenswrapper[4834]: I0130 21:38:40.999021 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:40 crc kubenswrapper[4834]: 
I0130 21:38:40.999029 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.000250 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc6106-f66d-4945-8ec9-91de63f2d579-operator-scripts\") pod \"nova-cell0-db-create-2dkk4\" (UID: \"51bc6106-f66d-4945-8ec9-91de63f2d579\") " pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.000300 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610bc5a1-2033-4f01-8ee9-e02c596fc94f-operator-scripts\") pod \"nova-api-98df-account-create-update-cdwj4\" (UID: \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\") " pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.023041 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e39cea01-f258-49de-a89e-380cc2ccdbb1" (UID: "e39cea01-f258-49de-a89e-380cc2ccdbb1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.024278 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.025562 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wndv\" (UniqueName: \"kubernetes.io/projected/610bc5a1-2033-4f01-8ee9-e02c596fc94f-kube-api-access-2wndv\") pod \"nova-api-98df-account-create-update-cdwj4\" (UID: \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\") " pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.033956 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r599\" (UniqueName: \"kubernetes.io/projected/51bc6106-f66d-4945-8ec9-91de63f2d579-kube-api-access-4r599\") pod \"nova-cell0-db-create-2dkk4\" (UID: \"51bc6106-f66d-4945-8ec9-91de63f2d579\") " pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.072827 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-config-data" (OuterVolumeSpecName: "config-data") pod "e39cea01-f258-49de-a89e-380cc2ccdbb1" (UID: "e39cea01-f258-49de-a89e-380cc2ccdbb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.099570 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvtxm\" (UniqueName: \"kubernetes.io/projected/bf239124-75b5-4aa0-ade1-af07156f6b14-kube-api-access-xvtxm\") pod \"nova-cell0-35d8-account-create-update-7rznt\" (UID: \"bf239124-75b5-4aa0-ade1-af07156f6b14\") " pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.099653 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf239124-75b5-4aa0-ade1-af07156f6b14-operator-scripts\") pod \"nova-cell0-35d8-account-create-update-7rznt\" (UID: \"bf239124-75b5-4aa0-ade1-af07156f6b14\") " pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.099675 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8h7r\" (UniqueName: \"kubernetes.io/projected/8547e76b-acd4-45ea-a2ba-308e26be62b5-kube-api-access-b8h7r\") pod \"nova-cell1-db-create-krbbm\" (UID: \"8547e76b-acd4-45ea-a2ba-308e26be62b5\") " pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.099718 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8547e76b-acd4-45ea-a2ba-308e26be62b5-operator-scripts\") pod \"nova-cell1-db-create-krbbm\" (UID: \"8547e76b-acd4-45ea-a2ba-308e26be62b5\") " pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.099799 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 
21:38:41.099811 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.099820 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e39cea01-f258-49de-a89e-380cc2ccdbb1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.101264 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf239124-75b5-4aa0-ade1-af07156f6b14-operator-scripts\") pod \"nova-cell0-35d8-account-create-update-7rznt\" (UID: \"bf239124-75b5-4aa0-ade1-af07156f6b14\") " pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.102135 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8547e76b-acd4-45ea-a2ba-308e26be62b5-operator-scripts\") pod \"nova-cell1-db-create-krbbm\" (UID: \"8547e76b-acd4-45ea-a2ba-308e26be62b5\") " pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.104533 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2910-account-create-update-wz247"] Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.106240 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.116708 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.129033 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvtxm\" (UniqueName: \"kubernetes.io/projected/bf239124-75b5-4aa0-ade1-af07156f6b14-kube-api-access-xvtxm\") pod \"nova-cell0-35d8-account-create-update-7rznt\" (UID: \"bf239124-75b5-4aa0-ade1-af07156f6b14\") " pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.134007 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2910-account-create-update-wz247"] Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.137616 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.163:9292/healthcheck\": read tcp 10.217.0.2:39324->10.217.0.163:9292: read: connection reset by peer" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.137912 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.163:9292/healthcheck\": read tcp 10.217.0.2:39308->10.217.0.163:9292: read: connection reset by peer" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.139966 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8h7r\" (UniqueName: \"kubernetes.io/projected/8547e76b-acd4-45ea-a2ba-308e26be62b5-kube-api-access-b8h7r\") pod \"nova-cell1-db-create-krbbm\" (UID: \"8547e76b-acd4-45ea-a2ba-308e26be62b5\") " 
pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.144055 4834 scope.go:117] "RemoveContainer" containerID="39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.151151 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.167562 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.301780 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.303894 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2scg\" (UniqueName: \"kubernetes.io/projected/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-kube-api-access-r2scg\") pod \"nova-cell1-2910-account-create-update-wz247\" (UID: \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\") " pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.303956 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-operator-scripts\") pod \"nova-cell1-2910-account-create-update-wz247\" (UID: \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\") " pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.310585 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.315738 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.349804 4834 scope.go:117] "RemoveContainer" containerID="206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789" Jan 30 21:38:41 crc kubenswrapper[4834]: E0130 21:38:41.354700 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789\": container with ID starting with 206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789 not found: ID does not exist" containerID="206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.354751 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789"} err="failed to get container status \"206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789\": rpc error: code = NotFound desc = could not find container \"206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789\": container with ID starting with 206ea6386aadfb53c01d5044ffc5763e2657436a44775aa5a623b85acaafd789 not found: ID does not exist" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.354777 4834 scope.go:117] "RemoveContainer" containerID="39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.358916 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.360416 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: E0130 21:38:41.372320 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122\": container with ID starting with 39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122 not found: ID does not exist" containerID="39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.372352 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122"} err="failed to get container status \"39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122\": rpc error: code = NotFound desc = could not find container \"39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122\": container with ID starting with 39170f5510c3802f6e3d477c9dcb64e58239b6495b6ad87565ff4b9b1a16d122 not found: ID does not exist" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.372567 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.372739 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.407194 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2scg\" (UniqueName: \"kubernetes.io/projected/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-kube-api-access-r2scg\") pod \"nova-cell1-2910-account-create-update-wz247\" (UID: \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\") " pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.407242 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-operator-scripts\") pod \"nova-cell1-2910-account-create-update-wz247\" (UID: \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\") " pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.413870 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-operator-scripts\") pod \"nova-cell1-2910-account-create-update-wz247\" (UID: \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\") " pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.432431 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2scg\" (UniqueName: \"kubernetes.io/projected/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-kube-api-access-r2scg\") pod \"nova-cell1-2910-account-create-update-wz247\" (UID: \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\") " pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.441372 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.455082 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.492971 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.555094 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.555686 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.565059 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-scripts\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.565149 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/468b9ac4-33a0-4138-8c6e-a83db4ff688d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.565173 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.565221 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468b9ac4-33a0-4138-8c6e-a83db4ff688d-logs\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.565254 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.565293 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qthw5\" (UniqueName: \"kubernetes.io/projected/468b9ac4-33a0-4138-8c6e-a83db4ff688d-kube-api-access-qthw5\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.606781 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfc4a01-3fc1-4360-bd1a-3643d5da2b05" path="/var/lib/kubelet/pods/8dfc4a01-3fc1-4360-bd1a-3643d5da2b05/volumes" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.607695 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39cea01-f258-49de-a89e-380cc2ccdbb1" path="/var/lib/kubelet/pods/e39cea01-f258-49de-a89e-380cc2ccdbb1/volumes" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.615485 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-v4rg6"] Jan 30 21:38:41 crc 
kubenswrapper[4834]: I0130 21:38:41.679162 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/468b9ac4-33a0-4138-8c6e-a83db4ff688d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.679463 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-config-data\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.679566 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468b9ac4-33a0-4138-8c6e-a83db4ff688d-logs\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.679664 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.679743 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qthw5\" (UniqueName: \"kubernetes.io/projected/468b9ac4-33a0-4138-8c6e-a83db4ff688d-kube-api-access-qthw5\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.679897 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.679980 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.680048 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-scripts\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.681330 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.685257 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/468b9ac4-33a0-4138-8c6e-a83db4ff688d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.685600 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/468b9ac4-33a0-4138-8c6e-a83db4ff688d-logs\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.695756 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-scripts\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.697318 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-config-data\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.703740 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qthw5\" (UniqueName: \"kubernetes.io/projected/468b9ac4-33a0-4138-8c6e-a83db4ff688d-kube-api-access-qthw5\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.707050 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.708009 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/468b9ac4-33a0-4138-8c6e-a83db4ff688d-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.713611 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"468b9ac4-33a0-4138-8c6e-a83db4ff688d\") " pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.726189 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.952704 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerStarted","Data":"0acaf899a50ce379ef77f170d9a42da497ef0b4ba0530a8f53d13591ee97753d"} Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.965128 4834 generic.go:334] "Generic (PLEG): container finished" podID="4075b406-33bb-40e3-9429-087ba19fcb32" containerID="379f553bea4d38792a3b231c9d0af167a7bb371728d3a16701c761a78703280f" exitCode=0 Jan 30 21:38:41 crc kubenswrapper[4834]: I0130 21:38:41.965264 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4075b406-33bb-40e3-9429-087ba19fcb32","Type":"ContainerDied","Data":"379f553bea4d38792a3b231c9d0af167a7bb371728d3a16701c761a78703280f"} Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.062686 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v4rg6" event={"ID":"70c39bab-7d70-4cf5-88cb-c7c61d3199a2","Type":"ContainerStarted","Data":"7e6444a0b6abb0edabfa3eb062969ad4892f3229737d07dd920eafb2b5254276"} Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.062729 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-v4rg6" event={"ID":"70c39bab-7d70-4cf5-88cb-c7c61d3199a2","Type":"ContainerStarted","Data":"259a129e8ed7a466f7c5351169588e22e1be2e3115c248e872659f6cac49b63d"} Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.071384 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-35d8-account-create-update-7rznt"] Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.172537 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-v4rg6" podStartSLOduration=2.172514869 podStartE2EDuration="2.172514869s" podCreationTimestamp="2026-01-30 21:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:42.124070991 +0000 UTC m=+1373.277217129" watchObservedRunningTime="2026-01-30 21:38:42.172514869 +0000 UTC m=+1373.325661007" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.400480 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.487556 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2dkk4"] Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.521961 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-krbbm"] Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.533473 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-98df-account-create-update-cdwj4"] Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.556611 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m26h\" (UniqueName: \"kubernetes.io/projected/4075b406-33bb-40e3-9429-087ba19fcb32-kube-api-access-9m26h\") pod \"4075b406-33bb-40e3-9429-087ba19fcb32\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.556715 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-scripts\") pod \"4075b406-33bb-40e3-9429-087ba19fcb32\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.556738 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-config-data\") pod \"4075b406-33bb-40e3-9429-087ba19fcb32\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.556755 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-internal-tls-certs\") pod \"4075b406-33bb-40e3-9429-087ba19fcb32\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " Jan 30 21:38:42 
crc kubenswrapper[4834]: I0130 21:38:42.556774 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4075b406-33bb-40e3-9429-087ba19fcb32\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.556852 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-combined-ca-bundle\") pod \"4075b406-33bb-40e3-9429-087ba19fcb32\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.556984 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-httpd-run\") pod \"4075b406-33bb-40e3-9429-087ba19fcb32\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.557038 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-logs\") pod \"4075b406-33bb-40e3-9429-087ba19fcb32\" (UID: \"4075b406-33bb-40e3-9429-087ba19fcb32\") " Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.573503 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-scripts" (OuterVolumeSpecName: "scripts") pod "4075b406-33bb-40e3-9429-087ba19fcb32" (UID: "4075b406-33bb-40e3-9429-087ba19fcb32"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.578525 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "4075b406-33bb-40e3-9429-087ba19fcb32" (UID: "4075b406-33bb-40e3-9429-087ba19fcb32"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.585563 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-logs" (OuterVolumeSpecName: "logs") pod "4075b406-33bb-40e3-9429-087ba19fcb32" (UID: "4075b406-33bb-40e3-9429-087ba19fcb32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.586495 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4075b406-33bb-40e3-9429-087ba19fcb32" (UID: "4075b406-33bb-40e3-9429-087ba19fcb32"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.608651 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4075b406-33bb-40e3-9429-087ba19fcb32-kube-api-access-9m26h" (OuterVolumeSpecName: "kube-api-access-9m26h") pod "4075b406-33bb-40e3-9429-087ba19fcb32" (UID: "4075b406-33bb-40e3-9429-087ba19fcb32"). InnerVolumeSpecName "kube-api-access-9m26h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.662685 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.663340 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.663384 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4075b406-33bb-40e3-9429-087ba19fcb32-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.663429 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m26h\" (UniqueName: \"kubernetes.io/projected/4075b406-33bb-40e3-9429-087ba19fcb32-kube-api-access-9m26h\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.663439 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.688747 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.723308 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4075b406-33bb-40e3-9429-087ba19fcb32" (UID: "4075b406-33bb-40e3-9429-087ba19fcb32"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.765213 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.765547 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.782843 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4075b406-33bb-40e3-9429-087ba19fcb32" (UID: "4075b406-33bb-40e3-9429-087ba19fcb32"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.785985 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-config-data" (OuterVolumeSpecName: "config-data") pod "4075b406-33bb-40e3-9429-087ba19fcb32" (UID: "4075b406-33bb-40e3-9429-087ba19fcb32"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.867088 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.867115 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4075b406-33bb-40e3-9429-087ba19fcb32-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.868491 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2910-account-create-update-wz247"] Jan 30 21:38:42 crc kubenswrapper[4834]: I0130 21:38:42.956318 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.075418 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-98df-account-create-update-cdwj4" event={"ID":"610bc5a1-2033-4f01-8ee9-e02c596fc94f","Type":"ContainerStarted","Data":"9401cd6a1ae2f99a0e3fe5bcb02927c3e58db1e18adbe686a9ed639734c48702"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.079845 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4075b406-33bb-40e3-9429-087ba19fcb32","Type":"ContainerDied","Data":"3cf0ab03c8105f4e5c3dc75fab9cc90350e6803d93da618c86c76fec018ca38d"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.080009 4834 scope.go:117] "RemoveContainer" containerID="379f553bea4d38792a3b231c9d0af167a7bb371728d3a16701c761a78703280f" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.080275 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.111081 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2dkk4" event={"ID":"51bc6106-f66d-4945-8ec9-91de63f2d579","Type":"ContainerStarted","Data":"34f089620cdd871efeed60f345b16afe7296eb37e3f22d711c4d68c2b1f70783"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.111121 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2dkk4" event={"ID":"51bc6106-f66d-4945-8ec9-91de63f2d579","Type":"ContainerStarted","Data":"4925bcef8cc3e5323e71d53e50bd3882b08a5b9876529d319c891cb3226f7900"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.117112 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"468b9ac4-33a0-4138-8c6e-a83db4ff688d","Type":"ContainerStarted","Data":"a4633f25d98df9329fc5d2e3af4f7db0ba2db30faf2635ff8eeaa1d963398df9"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.118499 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-krbbm" event={"ID":"8547e76b-acd4-45ea-a2ba-308e26be62b5","Type":"ContainerStarted","Data":"60f1416622a68111b676c34f5d878f452a6c64bf0657c640ccd0d8e3b1db4c65"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.129747 4834 generic.go:334] "Generic (PLEG): container finished" podID="70c39bab-7d70-4cf5-88cb-c7c61d3199a2" containerID="7e6444a0b6abb0edabfa3eb062969ad4892f3229737d07dd920eafb2b5254276" exitCode=0 Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.129810 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v4rg6" event={"ID":"70c39bab-7d70-4cf5-88cb-c7c61d3199a2","Type":"ContainerDied","Data":"7e6444a0b6abb0edabfa3eb062969ad4892f3229737d07dd920eafb2b5254276"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.142080 4834 scope.go:117] "RemoveContainer" 
containerID="43decc1b912af4a1c561492b82cda3a337dd6d4896c7395e9fa11b341ae505d1" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.145344 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2910-account-create-update-wz247" event={"ID":"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27","Type":"ContainerStarted","Data":"1b5ea804f255cbcfe7b2f1c747814daa1054ceb3bee030499917540fb53bcf13"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.155121 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.177112 4834 generic.go:334] "Generic (PLEG): container finished" podID="bf239124-75b5-4aa0-ade1-af07156f6b14" containerID="2d9cf2b23744cdea3e0b754b567ce60376c5e9a12a16a5b72d9b51e6a2a6bac2" exitCode=0 Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.177210 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-35d8-account-create-update-7rznt" event={"ID":"bf239124-75b5-4aa0-ade1-af07156f6b14","Type":"ContainerDied","Data":"2d9cf2b23744cdea3e0b754b567ce60376c5e9a12a16a5b72d9b51e6a2a6bac2"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.177236 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-35d8-account-create-update-7rznt" event={"ID":"bf239124-75b5-4aa0-ade1-af07156f6b14","Type":"ContainerStarted","Data":"56daeab9ed0571e7903cd452993a478a74cc6ace92f87026079526f0eb9d921a"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.179585 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-2dkk4" podStartSLOduration=3.179570427 podStartE2EDuration="3.179570427s" podCreationTimestamp="2026-01-30 21:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:43.145993219 +0000 UTC m=+1374.299139367" 
watchObservedRunningTime="2026-01-30 21:38:43.179570427 +0000 UTC m=+1374.332716565" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.179761 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.192794 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f4bb75569-jlmhp" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.211459 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerStarted","Data":"50fa03c4264e617d87d23189b361a3ae3f00f3191a4fb12f7170650f9a66b49e"} Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.260416 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:38:43 crc kubenswrapper[4834]: E0130 21:38:43.261387 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-log" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.261423 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-log" Jan 30 21:38:43 crc kubenswrapper[4834]: E0130 21:38:43.261477 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-httpd" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.261489 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-httpd" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.261830 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-log" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.261850 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4075b406-33bb-40e3-9429-087ba19fcb32" containerName="glance-httpd" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.291368 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.294326 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.326005 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.369237 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.395887 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.395944 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.396013 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69203bf2-de86-4d46-873d-1061b074c7c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc 
kubenswrapper[4834]: I0130 21:38:43.396060 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69203bf2-de86-4d46-873d-1061b074c7c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.396093 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.396112 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.396153 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.396238 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgsrq\" (UniqueName: \"kubernetes.io/projected/69203bf2-de86-4d46-873d-1061b074c7c8-kube-api-access-zgsrq\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.396569 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6688b77c56-lfnbd"] Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.396761 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6688b77c56-lfnbd" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerName="neutron-api" containerID="cri-o://92649105945e5fae740063a5468a15769cb5de8f12043adf683338238c63010d" gracePeriod=30 Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.397055 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6688b77c56-lfnbd" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerName="neutron-httpd" containerID="cri-o://afd4cf863cc75c6550395bfed730ef3a3323248c97988c6a2e8a2dc82bd3cfd5" gracePeriod=30 Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.498623 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69203bf2-de86-4d46-873d-1061b074c7c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.498677 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69203bf2-de86-4d46-873d-1061b074c7c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.498706 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.498722 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.498754 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.498811 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgsrq\" (UniqueName: \"kubernetes.io/projected/69203bf2-de86-4d46-873d-1061b074c7c8-kube-api-access-zgsrq\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.498851 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.498878 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " 
pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.499143 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69203bf2-de86-4d46-873d-1061b074c7c8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.499571 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69203bf2-de86-4d46-873d-1061b074c7c8-logs\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.499923 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.517185 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.517487 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 
21:38:43.517627 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.520190 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69203bf2-de86-4d46-873d-1061b074c7c8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.526278 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgsrq\" (UniqueName: \"kubernetes.io/projected/69203bf2-de86-4d46-873d-1061b074c7c8-kube-api-access-zgsrq\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.559854 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4075b406-33bb-40e3-9429-087ba19fcb32" path="/var/lib/kubelet/pods/4075b406-33bb-40e3-9429-087ba19fcb32/volumes" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.582503 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"69203bf2-de86-4d46-873d-1061b074c7c8\") " pod="openstack/glance-default-internal-api-0" Jan 30 21:38:43 crc kubenswrapper[4834]: I0130 21:38:43.647233 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.155072 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.164100 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8086ba30-2087-423e-835a-78c41737a883" containerName="kube-state-metrics" containerID="cri-o://416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d" gracePeriod=30 Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.241538 4834 generic.go:334] "Generic (PLEG): container finished" podID="51bc6106-f66d-4945-8ec9-91de63f2d579" containerID="34f089620cdd871efeed60f345b16afe7296eb37e3f22d711c4d68c2b1f70783" exitCode=0 Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.241611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2dkk4" event={"ID":"51bc6106-f66d-4945-8ec9-91de63f2d579","Type":"ContainerDied","Data":"34f089620cdd871efeed60f345b16afe7296eb37e3f22d711c4d68c2b1f70783"} Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.243810 4834 generic.go:334] "Generic (PLEG): container finished" podID="451d67a3-3dc8-4b25-9e88-b5e6e16fdb27" containerID="13c38d96cb9a4a020b8857f220884dd914750856d66040a7e82963dbdde3f53f" exitCode=0 Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.243874 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2910-account-create-update-wz247" event={"ID":"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27","Type":"ContainerDied","Data":"13c38d96cb9a4a020b8857f220884dd914750856d66040a7e82963dbdde3f53f"} Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.251061 4834 generic.go:334] "Generic (PLEG): container finished" podID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerID="afd4cf863cc75c6550395bfed730ef3a3323248c97988c6a2e8a2dc82bd3cfd5" exitCode=0 Jan 30 
21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.251123 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6688b77c56-lfnbd" event={"ID":"0067b767-3e5f-41d1-ba56-d64799b81a8c","Type":"ContainerDied","Data":"afd4cf863cc75c6550395bfed730ef3a3323248c97988c6a2e8a2dc82bd3cfd5"} Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.260894 4834 generic.go:334] "Generic (PLEG): container finished" podID="610bc5a1-2033-4f01-8ee9-e02c596fc94f" containerID="f283daa4b2072b17b52843506aefba9dd90c8c44b34da2157b5091272362becd" exitCode=0 Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.261156 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-98df-account-create-update-cdwj4" event={"ID":"610bc5a1-2033-4f01-8ee9-e02c596fc94f","Type":"ContainerDied","Data":"f283daa4b2072b17b52843506aefba9dd90c8c44b34da2157b5091272362becd"} Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.264413 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"468b9ac4-33a0-4138-8c6e-a83db4ff688d","Type":"ContainerStarted","Data":"6f949d5438780044ce86c6d1eebbcc99a8ec6f5576f896c6824556fa3f68a62b"} Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.268015 4834 generic.go:334] "Generic (PLEG): container finished" podID="8547e76b-acd4-45ea-a2ba-308e26be62b5" containerID="b539f7e809f078f0f410bcca6733678fc479b9adca829469beb7734a5beb229f" exitCode=0 Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.268477 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-krbbm" event={"ID":"8547e76b-acd4-45ea-a2ba-308e26be62b5","Type":"ContainerDied","Data":"b539f7e809f078f0f410bcca6733678fc479b9adca829469beb7734a5beb229f"} Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.335453 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.526158 4834 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="8086ba30-2087-423e-835a-78c41737a883" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": dial tcp 10.217.0.117:8081: connect: connection refused" Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.854098 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.933478 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-operator-scripts\") pod \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\" (UID: \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\") " Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.933667 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgqp5\" (UniqueName: \"kubernetes.io/projected/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-kube-api-access-jgqp5\") pod \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\" (UID: \"70c39bab-7d70-4cf5-88cb-c7c61d3199a2\") " Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.934963 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70c39bab-7d70-4cf5-88cb-c7c61d3199a2" (UID: "70c39bab-7d70-4cf5-88cb-c7c61d3199a2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:44 crc kubenswrapper[4834]: I0130 21:38:44.949386 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-kube-api-access-jgqp5" (OuterVolumeSpecName: "kube-api-access-jgqp5") pod "70c39bab-7d70-4cf5-88cb-c7c61d3199a2" (UID: "70c39bab-7d70-4cf5-88cb-c7c61d3199a2"). InnerVolumeSpecName "kube-api-access-jgqp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.036222 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgqp5\" (UniqueName: \"kubernetes.io/projected/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-kube-api-access-jgqp5\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.036260 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70c39bab-7d70-4cf5-88cb-c7c61d3199a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.251978 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.281164 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.296696 4834 generic.go:334] "Generic (PLEG): container finished" podID="8086ba30-2087-423e-835a-78c41737a883" containerID="416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d" exitCode=2 Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.297048 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8086ba30-2087-423e-835a-78c41737a883","Type":"ContainerDied","Data":"416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d"} Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.297083 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8086ba30-2087-423e-835a-78c41737a883","Type":"ContainerDied","Data":"88603fbacf25d125b9cde470f1fcb6339439a3cb9c78ad2ad98474ed81764cf4"} Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.297106 4834 scope.go:117] "RemoveContainer" containerID="416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.297225 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.315701 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-35d8-account-create-update-7rznt" event={"ID":"bf239124-75b5-4aa0-ade1-af07156f6b14","Type":"ContainerDied","Data":"56daeab9ed0571e7903cd452993a478a74cc6ace92f87026079526f0eb9d921a"} Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.315745 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56daeab9ed0571e7903cd452993a478a74cc6ace92f87026079526f0eb9d921a" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.315812 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-35d8-account-create-update-7rznt" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.322169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"468b9ac4-33a0-4138-8c6e-a83db4ff688d","Type":"ContainerStarted","Data":"098c665785e3d571bf4fa311bb64264bf8afc27d164d7ecba64d8cc4fd7c6a03"} Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.336489 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v4rg6" event={"ID":"70c39bab-7d70-4cf5-88cb-c7c61d3199a2","Type":"ContainerDied","Data":"259a129e8ed7a466f7c5351169588e22e1be2e3115c248e872659f6cac49b63d"} Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.336527 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="259a129e8ed7a466f7c5351169588e22e1be2e3115c248e872659f6cac49b63d" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.336587 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-v4rg6" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.341541 4834 scope.go:117] "RemoveContainer" containerID="416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.342450 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf239124-75b5-4aa0-ade1-af07156f6b14-operator-scripts\") pod \"bf239124-75b5-4aa0-ade1-af07156f6b14\" (UID: \"bf239124-75b5-4aa0-ade1-af07156f6b14\") " Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.342509 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvtxm\" (UniqueName: \"kubernetes.io/projected/bf239124-75b5-4aa0-ade1-af07156f6b14-kube-api-access-xvtxm\") pod \"bf239124-75b5-4aa0-ade1-af07156f6b14\" (UID: \"bf239124-75b5-4aa0-ade1-af07156f6b14\") " Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.342553 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcfqw\" (UniqueName: \"kubernetes.io/projected/8086ba30-2087-423e-835a-78c41737a883-kube-api-access-jcfqw\") pod \"8086ba30-2087-423e-835a-78c41737a883\" (UID: \"8086ba30-2087-423e-835a-78c41737a883\") " Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.347340 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf239124-75b5-4aa0-ade1-af07156f6b14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf239124-75b5-4aa0-ade1-af07156f6b14" (UID: "bf239124-75b5-4aa0-ade1-af07156f6b14"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:45 crc kubenswrapper[4834]: E0130 21:38:45.347856 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d\": container with ID starting with 416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d not found: ID does not exist" containerID="416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.347902 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d"} err="failed to get container status \"416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d\": rpc error: code = NotFound desc = could not find container \"416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d\": container with ID starting with 416bcec5429137c36a3b605d65a6d0ff4420e5825202febf9682227fa6ed307d not found: ID does not exist" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.349052 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69203bf2-de86-4d46-873d-1061b074c7c8","Type":"ContainerStarted","Data":"290e89671cdba44eb7c1fbe7f9da7439c64d0324365c6244d77d16a0c3564796"} Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.357914 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8086ba30-2087-423e-835a-78c41737a883-kube-api-access-jcfqw" (OuterVolumeSpecName: "kube-api-access-jcfqw") pod "8086ba30-2087-423e-835a-78c41737a883" (UID: "8086ba30-2087-423e-835a-78c41737a883"). InnerVolumeSpecName "kube-api-access-jcfqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.358048 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf239124-75b5-4aa0-ade1-af07156f6b14-kube-api-access-xvtxm" (OuterVolumeSpecName: "kube-api-access-xvtxm") pod "bf239124-75b5-4aa0-ade1-af07156f6b14" (UID: "bf239124-75b5-4aa0-ade1-af07156f6b14"). InnerVolumeSpecName "kube-api-access-xvtxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.362239 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.362219852 podStartE2EDuration="4.362219852s" podCreationTimestamp="2026-01-30 21:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:45.357714505 +0000 UTC m=+1376.510860643" watchObservedRunningTime="2026-01-30 21:38:45.362219852 +0000 UTC m=+1376.515365990" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.444774 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf239124-75b5-4aa0-ade1-af07156f6b14-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.444820 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvtxm\" (UniqueName: \"kubernetes.io/projected/bf239124-75b5-4aa0-ade1-af07156f6b14-kube-api-access-xvtxm\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.444834 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcfqw\" (UniqueName: \"kubernetes.io/projected/8086ba30-2087-423e-835a-78c41737a883-kube-api-access-jcfqw\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.653535 4834 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.660162 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.689448 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:38:45 crc kubenswrapper[4834]: E0130 21:38:45.690444 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf239124-75b5-4aa0-ade1-af07156f6b14" containerName="mariadb-account-create-update" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.692007 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf239124-75b5-4aa0-ade1-af07156f6b14" containerName="mariadb-account-create-update" Jan 30 21:38:45 crc kubenswrapper[4834]: E0130 21:38:45.692107 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c39bab-7d70-4cf5-88cb-c7c61d3199a2" containerName="mariadb-database-create" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.692183 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c39bab-7d70-4cf5-88cb-c7c61d3199a2" containerName="mariadb-database-create" Jan 30 21:38:45 crc kubenswrapper[4834]: E0130 21:38:45.692271 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8086ba30-2087-423e-835a-78c41737a883" containerName="kube-state-metrics" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.692348 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8086ba30-2087-423e-835a-78c41737a883" containerName="kube-state-metrics" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.692598 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf239124-75b5-4aa0-ade1-af07156f6b14" containerName="mariadb-account-create-update" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.693107 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8086ba30-2087-423e-835a-78c41737a883" containerName="kube-state-metrics" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.693207 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c39bab-7d70-4cf5-88cb-c7c61d3199a2" containerName="mariadb-database-create" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.694089 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.695920 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.700840 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.701352 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.763119 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.763186 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.763266 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfggb\" (UniqueName: 
\"kubernetes.io/projected/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-api-access-sfggb\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.763371 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.865171 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.865261 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfggb\" (UniqueName: \"kubernetes.io/projected/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-api-access-sfggb\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.865372 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.865424 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.870993 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.876165 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.896048 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:45 crc kubenswrapper[4834]: I0130 21:38:45.900301 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfggb\" (UniqueName: \"kubernetes.io/projected/9fbc7d25-3b57-4ad8-af31-35cd316da312-kube-api-access-sfggb\") pod \"kube-state-metrics-0\" (UID: \"9fbc7d25-3b57-4ad8-af31-35cd316da312\") " pod="openstack/kube-state-metrics-0" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.013570 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.021060 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.082690 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wndv\" (UniqueName: \"kubernetes.io/projected/610bc5a1-2033-4f01-8ee9-e02c596fc94f-kube-api-access-2wndv\") pod \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\" (UID: \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.082737 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610bc5a1-2033-4f01-8ee9-e02c596fc94f-operator-scripts\") pod \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\" (UID: \"610bc5a1-2033-4f01-8ee9-e02c596fc94f\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.089020 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/610bc5a1-2033-4f01-8ee9-e02c596fc94f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "610bc5a1-2033-4f01-8ee9-e02c596fc94f" (UID: "610bc5a1-2033-4f01-8ee9-e02c596fc94f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.103570 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/610bc5a1-2033-4f01-8ee9-e02c596fc94f-kube-api-access-2wndv" (OuterVolumeSpecName: "kube-api-access-2wndv") pod "610bc5a1-2033-4f01-8ee9-e02c596fc94f" (UID: "610bc5a1-2033-4f01-8ee9-e02c596fc94f"). InnerVolumeSpecName "kube-api-access-2wndv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.185056 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wndv\" (UniqueName: \"kubernetes.io/projected/610bc5a1-2033-4f01-8ee9-e02c596fc94f-kube-api-access-2wndv\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.185316 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610bc5a1-2033-4f01-8ee9-e02c596fc94f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.257144 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.261676 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.283062 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.362135 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-98df-account-create-update-cdwj4" event={"ID":"610bc5a1-2033-4f01-8ee9-e02c596fc94f","Type":"ContainerDied","Data":"9401cd6a1ae2f99a0e3fe5bcb02927c3e58db1e18adbe686a9ed639734c48702"} Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.362169 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-98df-account-create-update-cdwj4" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.362173 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9401cd6a1ae2f99a0e3fe5bcb02927c3e58db1e18adbe686a9ed639734c48702" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.365149 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-krbbm" event={"ID":"8547e76b-acd4-45ea-a2ba-308e26be62b5","Type":"ContainerDied","Data":"60f1416622a68111b676c34f5d878f452a6c64bf0657c640ccd0d8e3b1db4c65"} Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.365183 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60f1416622a68111b676c34f5d878f452a6c64bf0657c640ccd0d8e3b1db4c65" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.365246 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-krbbm" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.372097 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2dkk4" event={"ID":"51bc6106-f66d-4945-8ec9-91de63f2d579","Type":"ContainerDied","Data":"4925bcef8cc3e5323e71d53e50bd3882b08a5b9876529d319c891cb3226f7900"} Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.372135 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4925bcef8cc3e5323e71d53e50bd3882b08a5b9876529d319c891cb3226f7900" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.372187 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2dkk4" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.378483 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69203bf2-de86-4d46-873d-1061b074c7c8","Type":"ContainerStarted","Data":"11a509a5f01d09926de80a8d804acea7b7873aec2808e17c4c0bca845f708411"} Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.390901 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8547e76b-acd4-45ea-a2ba-308e26be62b5-operator-scripts\") pod \"8547e76b-acd4-45ea-a2ba-308e26be62b5\" (UID: \"8547e76b-acd4-45ea-a2ba-308e26be62b5\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.391256 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2scg\" (UniqueName: \"kubernetes.io/projected/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-kube-api-access-r2scg\") pod \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\" (UID: \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.391344 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8h7r\" (UniqueName: \"kubernetes.io/projected/8547e76b-acd4-45ea-a2ba-308e26be62b5-kube-api-access-b8h7r\") pod \"8547e76b-acd4-45ea-a2ba-308e26be62b5\" (UID: \"8547e76b-acd4-45ea-a2ba-308e26be62b5\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.391424 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r599\" (UniqueName: \"kubernetes.io/projected/51bc6106-f66d-4945-8ec9-91de63f2d579-kube-api-access-4r599\") pod \"51bc6106-f66d-4945-8ec9-91de63f2d579\" (UID: \"51bc6106-f66d-4945-8ec9-91de63f2d579\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.391538 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc6106-f66d-4945-8ec9-91de63f2d579-operator-scripts\") pod \"51bc6106-f66d-4945-8ec9-91de63f2d579\" (UID: \"51bc6106-f66d-4945-8ec9-91de63f2d579\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.391629 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-operator-scripts\") pod \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\" (UID: \"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.392871 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "451d67a3-3dc8-4b25-9e88-b5e6e16fdb27" (UID: "451d67a3-3dc8-4b25-9e88-b5e6e16fdb27"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.393367 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8547e76b-acd4-45ea-a2ba-308e26be62b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8547e76b-acd4-45ea-a2ba-308e26be62b5" (UID: "8547e76b-acd4-45ea-a2ba-308e26be62b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.393939 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2910-account-create-update-wz247" event={"ID":"451d67a3-3dc8-4b25-9e88-b5e6e16fdb27","Type":"ContainerDied","Data":"1b5ea804f255cbcfe7b2f1c747814daa1054ceb3bee030499917540fb53bcf13"} Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.394071 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b5ea804f255cbcfe7b2f1c747814daa1054ceb3bee030499917540fb53bcf13" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.393986 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2910-account-create-update-wz247" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.394090 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51bc6106-f66d-4945-8ec9-91de63f2d579-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51bc6106-f66d-4945-8ec9-91de63f2d579" (UID: "51bc6106-f66d-4945-8ec9-91de63f2d579"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.404559 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8547e76b-acd4-45ea-a2ba-308e26be62b5-kube-api-access-b8h7r" (OuterVolumeSpecName: "kube-api-access-b8h7r") pod "8547e76b-acd4-45ea-a2ba-308e26be62b5" (UID: "8547e76b-acd4-45ea-a2ba-308e26be62b5"). InnerVolumeSpecName "kube-api-access-b8h7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.407551 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51bc6106-f66d-4945-8ec9-91de63f2d579-kube-api-access-4r599" (OuterVolumeSpecName: "kube-api-access-4r599") pod "51bc6106-f66d-4945-8ec9-91de63f2d579" (UID: "51bc6106-f66d-4945-8ec9-91de63f2d579"). InnerVolumeSpecName "kube-api-access-4r599". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.410649 4834 generic.go:334] "Generic (PLEG): container finished" podID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerID="92649105945e5fae740063a5468a15769cb5de8f12043adf683338238c63010d" exitCode=0 Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.410762 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6688b77c56-lfnbd" event={"ID":"0067b767-3e5f-41d1-ba56-d64799b81a8c","Type":"ContainerDied","Data":"92649105945e5fae740063a5468a15769cb5de8f12043adf683338238c63010d"} Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.413932 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-kube-api-access-r2scg" (OuterVolumeSpecName: "kube-api-access-r2scg") pod "451d67a3-3dc8-4b25-9e88-b5e6e16fdb27" (UID: "451d67a3-3dc8-4b25-9e88-b5e6e16fdb27"). InnerVolumeSpecName "kube-api-access-r2scg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.415180 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerStarted","Data":"6997e47777b5a3cda9b3c9365df2e0d294bf58d55f3921034f850c89c429a481"} Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.415315 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="ceilometer-central-agent" containerID="cri-o://c587ccd377f00643c81d87f7f684f83317b45bc37a318bfab347463f6869ad5f" gracePeriod=30 Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.415432 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="proxy-httpd" containerID="cri-o://6997e47777b5a3cda9b3c9365df2e0d294bf58d55f3921034f850c89c429a481" gracePeriod=30 Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.415470 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="sg-core" containerID="cri-o://50fa03c4264e617d87d23189b361a3ae3f00f3191a4fb12f7170650f9a66b49e" gracePeriod=30 Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.415522 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="ceilometer-notification-agent" containerID="cri-o://0acaf899a50ce379ef77f170d9a42da497ef0b4ba0530a8f53d13591ee97753d" gracePeriod=30 Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.415633 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.447119 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.160682756 podStartE2EDuration="8.447102128s" podCreationTimestamp="2026-01-30 21:38:38 +0000 UTC" firstStartedPulling="2026-01-30 21:38:39.791008388 +0000 UTC m=+1370.944154526" lastFinishedPulling="2026-01-30 21:38:45.07742776 +0000 UTC m=+1376.230573898" observedRunningTime="2026-01-30 21:38:46.434351008 +0000 UTC m=+1377.587497146" watchObservedRunningTime="2026-01-30 21:38:46.447102128 +0000 UTC m=+1377.600248266" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.494811 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8h7r\" (UniqueName: \"kubernetes.io/projected/8547e76b-acd4-45ea-a2ba-308e26be62b5-kube-api-access-b8h7r\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.495097 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r599\" (UniqueName: \"kubernetes.io/projected/51bc6106-f66d-4945-8ec9-91de63f2d579-kube-api-access-4r599\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.495117 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc6106-f66d-4945-8ec9-91de63f2d579-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.495132 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.495144 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8547e76b-acd4-45ea-a2ba-308e26be62b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.495157 4834 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2scg\" (UniqueName: \"kubernetes.io/projected/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27-kube-api-access-r2scg\") on node \"crc\" DevicePath \"\"" Jan 30 21:38:46 crc kubenswrapper[4834]: W0130 21:38:46.638266 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fbc7d25_3b57_4ad8_af31_35cd316da312.slice/crio-718d8ad46eb0f564246b871318417d75f0bdc525cb8ad537999da4e52cf91666 WatchSource:0}: Error finding container 718d8ad46eb0f564246b871318417d75f0bdc525cb8ad537999da4e52cf91666: Status 404 returned error can't find the container with id 718d8ad46eb0f564246b871318417d75f0bdc525cb8ad537999da4e52cf91666 Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.639336 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.712830 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6688b77c56-lfnbd" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.799981 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-config\") pod \"0067b767-3e5f-41d1-ba56-d64799b81a8c\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.800094 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-424j5\" (UniqueName: \"kubernetes.io/projected/0067b767-3e5f-41d1-ba56-d64799b81a8c-kube-api-access-424j5\") pod \"0067b767-3e5f-41d1-ba56-d64799b81a8c\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.800118 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-httpd-config\") pod \"0067b767-3e5f-41d1-ba56-d64799b81a8c\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.800216 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-ovndb-tls-certs\") pod \"0067b767-3e5f-41d1-ba56-d64799b81a8c\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.800256 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-combined-ca-bundle\") pod \"0067b767-3e5f-41d1-ba56-d64799b81a8c\" (UID: \"0067b767-3e5f-41d1-ba56-d64799b81a8c\") " Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.806795 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0067b767-3e5f-41d1-ba56-d64799b81a8c-kube-api-access-424j5" (OuterVolumeSpecName: "kube-api-access-424j5") pod "0067b767-3e5f-41d1-ba56-d64799b81a8c" (UID: "0067b767-3e5f-41d1-ba56-d64799b81a8c"). InnerVolumeSpecName "kube-api-access-424j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.808364 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0067b767-3e5f-41d1-ba56-d64799b81a8c" (UID: "0067b767-3e5f-41d1-ba56-d64799b81a8c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.852603 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0067b767-3e5f-41d1-ba56-d64799b81a8c" (UID: "0067b767-3e5f-41d1-ba56-d64799b81a8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.857040 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-config" (OuterVolumeSpecName: "config") pod "0067b767-3e5f-41d1-ba56-d64799b81a8c" (UID: "0067b767-3e5f-41d1-ba56-d64799b81a8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.884161 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0067b767-3e5f-41d1-ba56-d64799b81a8c" (UID: "0067b767-3e5f-41d1-ba56-d64799b81a8c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.902614 4834 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.902646 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.902659 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.902668 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-424j5\" (UniqueName: \"kubernetes.io/projected/0067b767-3e5f-41d1-ba56-d64799b81a8c-kube-api-access-424j5\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:46 crc kubenswrapper[4834]: I0130 21:38:46.902681 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0067b767-3e5f-41d1-ba56-d64799b81a8c-httpd-config\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.428281 4834 generic.go:334] "Generic (PLEG): container finished" podID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerID="6997e47777b5a3cda9b3c9365df2e0d294bf58d55f3921034f850c89c429a481" exitCode=0
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.428548 4834 generic.go:334] "Generic (PLEG): container finished" podID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerID="50fa03c4264e617d87d23189b361a3ae3f00f3191a4fb12f7170650f9a66b49e" exitCode=2
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.428556 4834 generic.go:334] "Generic (PLEG): container finished" podID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerID="0acaf899a50ce379ef77f170d9a42da497ef0b4ba0530a8f53d13591ee97753d" exitCode=0
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.428374 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerDied","Data":"6997e47777b5a3cda9b3c9365df2e0d294bf58d55f3921034f850c89c429a481"}
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.428622 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerDied","Data":"50fa03c4264e617d87d23189b361a3ae3f00f3191a4fb12f7170650f9a66b49e"}
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.428661 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerDied","Data":"0acaf899a50ce379ef77f170d9a42da497ef0b4ba0530a8f53d13591ee97753d"}
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.430569 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69203bf2-de86-4d46-873d-1061b074c7c8","Type":"ContainerStarted","Data":"ea8e6f0feae774caea44a1bda68a8927c4cd4eee45af4a4479b6a51a39df6945"}
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.435347 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9fbc7d25-3b57-4ad8-af31-35cd316da312","Type":"ContainerStarted","Data":"447abd09525722d0fff0d8cf6e6adc10865d55b7c6197b45e25564bcd6cc8585"}
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.435418 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9fbc7d25-3b57-4ad8-af31-35cd316da312","Type":"ContainerStarted","Data":"718d8ad46eb0f564246b871318417d75f0bdc525cb8ad537999da4e52cf91666"}
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.437896 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6688b77c56-lfnbd" event={"ID":"0067b767-3e5f-41d1-ba56-d64799b81a8c","Type":"ContainerDied","Data":"4612a3bba0651f77622deea299f7faf63296bf37d6159fbd1b12258b776b22f0"}
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.437938 4834 scope.go:117] "RemoveContainer" containerID="afd4cf863cc75c6550395bfed730ef3a3323248c97988c6a2e8a2dc82bd3cfd5"
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.438040 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6688b77c56-lfnbd"
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.463733 4834 scope.go:117] "RemoveContainer" containerID="92649105945e5fae740063a5468a15769cb5de8f12043adf683338238c63010d"
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.472351 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.472333408 podStartE2EDuration="4.472333408s" podCreationTimestamp="2026-01-30 21:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:38:47.462691596 +0000 UTC m=+1378.615837734" watchObservedRunningTime="2026-01-30 21:38:47.472333408 +0000 UTC m=+1378.625479546"
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.504485 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.068733093 podStartE2EDuration="2.504464386s" podCreationTimestamp="2026-01-30 21:38:45 +0000 UTC" firstStartedPulling="2026-01-30 21:38:46.64051569 +0000 UTC m=+1377.793661828" lastFinishedPulling="2026-01-30 21:38:47.076246983 +0000 UTC m=+1378.229393121" observedRunningTime="2026-01-30 21:38:47.491043347 +0000 UTC m=+1378.644189485" watchObservedRunningTime="2026-01-30 21:38:47.504464386 +0000 UTC m=+1378.657610524"
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.558476 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8086ba30-2087-423e-835a-78c41737a883" path="/var/lib/kubelet/pods/8086ba30-2087-423e-835a-78c41737a883/volumes"
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.559003 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6688b77c56-lfnbd"]
Jan 30 21:38:47 crc kubenswrapper[4834]: I0130 21:38:47.560677 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6688b77c56-lfnbd"]
Jan 30 21:38:48 crc kubenswrapper[4834]: I0130 21:38:48.451455 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 30 21:38:49 crc kubenswrapper[4834]: I0130 21:38:49.547217 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" path="/var/lib/kubelet/pods/0067b767-3e5f-41d1-ba56-d64799b81a8c/volumes"
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.473708 4834 generic.go:334] "Generic (PLEG): container finished" podID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerID="c587ccd377f00643c81d87f7f684f83317b45bc37a318bfab347463f6869ad5f" exitCode=0
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.474096 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerDied","Data":"c587ccd377f00643c81d87f7f684f83317b45bc37a318bfab347463f6869ad5f"}
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.718276 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.787190 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-combined-ca-bundle\") pod \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") "
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.787310 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-log-httpd\") pod \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") "
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.787340 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-config-data\") pod \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") "
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.787372 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-scripts\") pod \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") "
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.787511 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75cz5\" (UniqueName: \"kubernetes.io/projected/bd2c492a-77e9-46c6-8a7d-943e43cc7776-kube-api-access-75cz5\") pod \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") "
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.787598 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-run-httpd\") pod \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") "
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.787677 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-sg-core-conf-yaml\") pod \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\" (UID: \"bd2c492a-77e9-46c6-8a7d-943e43cc7776\") "
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.788039 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd2c492a-77e9-46c6-8a7d-943e43cc7776" (UID: "bd2c492a-77e9-46c6-8a7d-943e43cc7776"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.788105 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd2c492a-77e9-46c6-8a7d-943e43cc7776" (UID: "bd2c492a-77e9-46c6-8a7d-943e43cc7776"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.804285 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-scripts" (OuterVolumeSpecName: "scripts") pod "bd2c492a-77e9-46c6-8a7d-943e43cc7776" (UID: "bd2c492a-77e9-46c6-8a7d-943e43cc7776"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.810294 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.810356 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd2c492a-77e9-46c6-8a7d-943e43cc7776-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.810373 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-scripts\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.827088 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2c492a-77e9-46c6-8a7d-943e43cc7776-kube-api-access-75cz5" (OuterVolumeSpecName: "kube-api-access-75cz5") pod "bd2c492a-77e9-46c6-8a7d-943e43cc7776" (UID: "bd2c492a-77e9-46c6-8a7d-943e43cc7776"). InnerVolumeSpecName "kube-api-access-75cz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.850885 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd2c492a-77e9-46c6-8a7d-943e43cc7776" (UID: "bd2c492a-77e9-46c6-8a7d-943e43cc7776"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.892651 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-config-data" (OuterVolumeSpecName: "config-data") pod "bd2c492a-77e9-46c6-8a7d-943e43cc7776" (UID: "bd2c492a-77e9-46c6-8a7d-943e43cc7776"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.900115 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd2c492a-77e9-46c6-8a7d-943e43cc7776" (UID: "bd2c492a-77e9-46c6-8a7d-943e43cc7776"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.912670 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.912715 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.912731 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75cz5\" (UniqueName: \"kubernetes.io/projected/bd2c492a-77e9-46c6-8a7d-943e43cc7776-kube-api-access-75cz5\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:50 crc kubenswrapper[4834]: I0130 21:38:50.912743 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd2c492a-77e9-46c6-8a7d-943e43cc7776-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.225793 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7sj55"]
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226475 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8547e76b-acd4-45ea-a2ba-308e26be62b5" containerName="mariadb-database-create"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226493 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8547e76b-acd4-45ea-a2ba-308e26be62b5" containerName="mariadb-database-create"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226509 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451d67a3-3dc8-4b25-9e88-b5e6e16fdb27" containerName="mariadb-account-create-update"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226516 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="451d67a3-3dc8-4b25-9e88-b5e6e16fdb27" containerName="mariadb-account-create-update"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226527 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerName="neutron-httpd"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226533 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerName="neutron-httpd"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226540 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="ceilometer-central-agent"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226546 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="ceilometer-central-agent"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226560 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="sg-core"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226566 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="sg-core"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226576 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610bc5a1-2033-4f01-8ee9-e02c596fc94f" containerName="mariadb-account-create-update"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226581 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="610bc5a1-2033-4f01-8ee9-e02c596fc94f" containerName="mariadb-account-create-update"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226596 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerName="neutron-api"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226604 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerName="neutron-api"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226622 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="ceilometer-notification-agent"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226632 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="ceilometer-notification-agent"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226643 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="proxy-httpd"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226650 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="proxy-httpd"
Jan 30 21:38:51 crc kubenswrapper[4834]: E0130 21:38:51.226676 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51bc6106-f66d-4945-8ec9-91de63f2d579" containerName="mariadb-database-create"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226684 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="51bc6106-f66d-4945-8ec9-91de63f2d579" containerName="mariadb-database-create"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226880 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="51bc6106-f66d-4945-8ec9-91de63f2d579" containerName="mariadb-database-create"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226893 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="proxy-httpd"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226906 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="610bc5a1-2033-4f01-8ee9-e02c596fc94f" containerName="mariadb-account-create-update"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226921 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="sg-core"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226933 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerName="neutron-api"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226947 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="451d67a3-3dc8-4b25-9e88-b5e6e16fdb27" containerName="mariadb-account-create-update"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226961 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0067b767-3e5f-41d1-ba56-d64799b81a8c" containerName="neutron-httpd"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226973 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8547e76b-acd4-45ea-a2ba-308e26be62b5" containerName="mariadb-database-create"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226985 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="ceilometer-notification-agent"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.226993 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" containerName="ceilometer-central-agent"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.227675 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.231146 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.231427 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.232417 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-266c5"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.243269 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7sj55"]
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.320757 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-scripts\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.320821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.321221 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-config-data\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.321386 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdf7\" (UniqueName: \"kubernetes.io/projected/665b0f39-b1d4-431f-8741-90c0b6d31d52-kube-api-access-ftdf7\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.422785 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-config-data\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.422843 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdf7\" (UniqueName: \"kubernetes.io/projected/665b0f39-b1d4-431f-8741-90c0b6d31d52-kube-api-access-ftdf7\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.422890 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-scripts\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.422929 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.426880 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-scripts\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.429125 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.438440 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-config-data\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.457010 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdf7\" (UniqueName: \"kubernetes.io/projected/665b0f39-b1d4-431f-8741-90c0b6d31d52-kube-api-access-ftdf7\") pod \"nova-cell0-conductor-db-sync-7sj55\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.493018 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.492894 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd2c492a-77e9-46c6-8a7d-943e43cc7776","Type":"ContainerDied","Data":"d8b8b79eb618ef6c8a1cdc54f55b1623e7f0df29e5c32940be2abaa90c7f1552"}
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.497478 4834 scope.go:117] "RemoveContainer" containerID="6997e47777b5a3cda9b3c9365df2e0d294bf58d55f3921034f850c89c429a481"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.543188 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7sj55"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.567641 4834 scope.go:117] "RemoveContainer" containerID="50fa03c4264e617d87d23189b361a3ae3f00f3191a4fb12f7170650f9a66b49e"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.590056 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.615608 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.624093 4834 scope.go:117] "RemoveContainer" containerID="0acaf899a50ce379ef77f170d9a42da497ef0b4ba0530a8f53d13591ee97753d"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.627052 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.629475 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.632756 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.632986 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.633099 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.651097 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.720829 4834 scope.go:117] "RemoveContainer" containerID="c587ccd377f00643c81d87f7f684f83317b45bc37a318bfab347463f6869ad5f"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.732795 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.732842 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.732928 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.733076 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.733121 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.733147 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-config-data\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.733194 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-log-httpd\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.733230 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5g8t\" (UniqueName: \"kubernetes.io/projected/fafa3570-6c06-47a3-afda-79a1062452ed-kube-api-access-m5g8t\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.733274 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-scripts\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.733287 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-run-httpd\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.769717 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.784786 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.834603 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.834761 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.834843 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-config-data\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.834937 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-log-httpd\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.835022 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5g8t\" (UniqueName: \"kubernetes.io/projected/fafa3570-6c06-47a3-afda-79a1062452ed-kube-api-access-m5g8t\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.835122 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-scripts\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.835186 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-run-httpd\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.835261 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.835659 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-log-httpd\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.837452 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-run-httpd\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.839739 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.842422 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-config-data\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.850272 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-scripts\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.850535 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.852384 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:51 crc kubenswrapper[4834]: I0130 21:38:51.852921 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5g8t\" (UniqueName: \"kubernetes.io/projected/fafa3570-6c06-47a3-afda-79a1062452ed-kube-api-access-m5g8t\") pod \"ceilometer-0\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " pod="openstack/ceilometer-0"
Jan 30 21:38:52 crc kubenswrapper[4834]: I0130 21:38:52.004655 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 30 21:38:52 crc kubenswrapper[4834]: I0130 21:38:52.124249 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7sj55"]
Jan 30 21:38:52 crc kubenswrapper[4834]: I0130 21:38:52.517724 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7sj55" event={"ID":"665b0f39-b1d4-431f-8741-90c0b6d31d52","Type":"ContainerStarted","Data":"48340eb894a004c8dd15966e0a606ed65e99917ab2dddfa87a1597846444a64a"}
Jan 30 21:38:52 crc kubenswrapper[4834]: I0130 21:38:52.518116 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 21:38:52 crc kubenswrapper[4834]: I0130 21:38:52.518141 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 30 21:38:52 crc kubenswrapper[4834]: I0130 21:38:52.605706 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:53 crc kubenswrapper[4834]: I0130 21:38:53.498080 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 30 21:38:53 crc kubenswrapper[4834]: I0130 21:38:53.551615 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2c492a-77e9-46c6-8a7d-943e43cc7776" path="/var/lib/kubelet/pods/bd2c492a-77e9-46c6-8a7d-943e43cc7776/volumes"
Jan 30 21:38:53 crc kubenswrapper[4834]: I0130 21:38:53.552794 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerStarted","Data":"bb55cab1192d1792936202539ace31ed17430c9bd2639b4b38752b99acfc8892"} Jan 30 21:38:53 crc kubenswrapper[4834]: I0130 21:38:53.552831 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerStarted","Data":"174b3faaabbd2566b5f9a4a86e442213d49385d2a5dece2366ffbb06ac836df1"} Jan 30 21:38:53 crc kubenswrapper[4834]: I0130 21:38:53.647846 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:53 crc kubenswrapper[4834]: I0130 21:38:53.647898 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:53 crc kubenswrapper[4834]: I0130 21:38:53.709627 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:53 crc kubenswrapper[4834]: I0130 21:38:53.740349 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:54 crc kubenswrapper[4834]: I0130 21:38:54.550096 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerStarted","Data":"e28acc9c1e4817836d5543db7117e4396c7112aa85f5570f6ade46093b2feee6"} Jan 30 21:38:54 crc kubenswrapper[4834]: I0130 21:38:54.550474 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:54 crc kubenswrapper[4834]: I0130 21:38:54.550494 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:54 crc kubenswrapper[4834]: I0130 21:38:54.550247 4834 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Jan 30 21:38:54 crc kubenswrapper[4834]: I0130 21:38:54.550523 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:38:54 crc kubenswrapper[4834]: I0130 21:38:54.903828 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 21:38:54 crc kubenswrapper[4834]: I0130 21:38:54.906112 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 30 21:38:55 crc kubenswrapper[4834]: I0130 21:38:55.565655 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerStarted","Data":"0486912439aa342c5a076ccb76600a914e085bc3c5d1e088bee3eb78712c9d29"} Jan 30 21:38:56 crc kubenswrapper[4834]: I0130 21:38:56.073771 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 30 21:38:56 crc kubenswrapper[4834]: I0130 21:38:56.584174 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:38:56 crc kubenswrapper[4834]: I0130 21:38:56.584202 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 30 21:38:56 crc kubenswrapper[4834]: I0130 21:38:56.806495 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 21:38:56 crc kubenswrapper[4834]: I0130 21:38:56.808067 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 21:39:02.652905 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerStarted","Data":"c97e8cdfa01d0bb281b88ac1712646d6ee18d6bc794750bcf01e43f25d0e848a"} Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 
21:39:02.653727 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 21:39:02.653388 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="sg-core" containerID="cri-o://0486912439aa342c5a076ccb76600a914e085bc3c5d1e088bee3eb78712c9d29" gracePeriod=30 Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 21:39:02.653211 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="ceilometer-central-agent" containerID="cri-o://bb55cab1192d1792936202539ace31ed17430c9bd2639b4b38752b99acfc8892" gracePeriod=30 Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 21:39:02.653588 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="ceilometer-notification-agent" containerID="cri-o://e28acc9c1e4817836d5543db7117e4396c7112aa85f5570f6ade46093b2feee6" gracePeriod=30 Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 21:39:02.653456 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="proxy-httpd" containerID="cri-o://c97e8cdfa01d0bb281b88ac1712646d6ee18d6bc794750bcf01e43f25d0e848a" gracePeriod=30 Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 21:39:02.657856 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7sj55" event={"ID":"665b0f39-b1d4-431f-8741-90c0b6d31d52","Type":"ContainerStarted","Data":"48a941c4cb471658f4609af2ce965991f76cb6cc26499a54ecacbe135e58b6ed"} Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 21:39:02.706536 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.331284988 podStartE2EDuration="11.706512231s" podCreationTimestamp="2026-01-30 21:38:51 +0000 UTC" firstStartedPulling="2026-01-30 21:38:52.614321582 +0000 UTC m=+1383.767467730" lastFinishedPulling="2026-01-30 21:39:01.989548835 +0000 UTC m=+1393.142694973" observedRunningTime="2026-01-30 21:39:02.694313936 +0000 UTC m=+1393.847460104" watchObservedRunningTime="2026-01-30 21:39:02.706512231 +0000 UTC m=+1393.859658389" Jan 30 21:39:02 crc kubenswrapper[4834]: I0130 21:39:02.719988 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-7sj55" podStartSLOduration=1.8193446720000002 podStartE2EDuration="11.719970351s" podCreationTimestamp="2026-01-30 21:38:51 +0000 UTC" firstStartedPulling="2026-01-30 21:38:52.131018564 +0000 UTC m=+1383.284164712" lastFinishedPulling="2026-01-30 21:39:02.031644253 +0000 UTC m=+1393.184790391" observedRunningTime="2026-01-30 21:39:02.711671067 +0000 UTC m=+1393.864817205" watchObservedRunningTime="2026-01-30 21:39:02.719970351 +0000 UTC m=+1393.873116479" Jan 30 21:39:03 crc kubenswrapper[4834]: I0130 21:39:03.668904 4834 generic.go:334] "Generic (PLEG): container finished" podID="fafa3570-6c06-47a3-afda-79a1062452ed" containerID="c97e8cdfa01d0bb281b88ac1712646d6ee18d6bc794750bcf01e43f25d0e848a" exitCode=0 Jan 30 21:39:03 crc kubenswrapper[4834]: I0130 21:39:03.669228 4834 generic.go:334] "Generic (PLEG): container finished" podID="fafa3570-6c06-47a3-afda-79a1062452ed" containerID="0486912439aa342c5a076ccb76600a914e085bc3c5d1e088bee3eb78712c9d29" exitCode=2 Jan 30 21:39:03 crc kubenswrapper[4834]: I0130 21:39:03.669239 4834 generic.go:334] "Generic (PLEG): container finished" podID="fafa3570-6c06-47a3-afda-79a1062452ed" containerID="e28acc9c1e4817836d5543db7117e4396c7112aa85f5570f6ade46093b2feee6" exitCode=0 Jan 30 21:39:03 crc kubenswrapper[4834]: I0130 21:39:03.668971 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerDied","Data":"c97e8cdfa01d0bb281b88ac1712646d6ee18d6bc794750bcf01e43f25d0e848a"} Jan 30 21:39:03 crc kubenswrapper[4834]: I0130 21:39:03.669325 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerDied","Data":"0486912439aa342c5a076ccb76600a914e085bc3c5d1e088bee3eb78712c9d29"} Jan 30 21:39:03 crc kubenswrapper[4834]: I0130 21:39:03.669339 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerDied","Data":"e28acc9c1e4817836d5543db7117e4396c7112aa85f5570f6ade46093b2feee6"} Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.680290 4834 generic.go:334] "Generic (PLEG): container finished" podID="fafa3570-6c06-47a3-afda-79a1062452ed" containerID="bb55cab1192d1792936202539ace31ed17430c9bd2639b4b38752b99acfc8892" exitCode=0 Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.680304 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerDied","Data":"bb55cab1192d1792936202539ace31ed17430c9bd2639b4b38752b99acfc8892"} Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.680795 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fafa3570-6c06-47a3-afda-79a1062452ed","Type":"ContainerDied","Data":"174b3faaabbd2566b5f9a4a86e442213d49385d2a5dece2366ffbb06ac836df1"} Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.680811 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="174b3faaabbd2566b5f9a4a86e442213d49385d2a5dece2366ffbb06ac836df1" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.719059 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835048 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-log-httpd\") pod \"fafa3570-6c06-47a3-afda-79a1062452ed\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835117 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-combined-ca-bundle\") pod \"fafa3570-6c06-47a3-afda-79a1062452ed\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835165 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-sg-core-conf-yaml\") pod \"fafa3570-6c06-47a3-afda-79a1062452ed\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835244 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-config-data\") pod \"fafa3570-6c06-47a3-afda-79a1062452ed\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835309 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-ceilometer-tls-certs\") pod \"fafa3570-6c06-47a3-afda-79a1062452ed\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835411 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-scripts\") pod \"fafa3570-6c06-47a3-afda-79a1062452ed\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835447 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5g8t\" (UniqueName: \"kubernetes.io/projected/fafa3570-6c06-47a3-afda-79a1062452ed-kube-api-access-m5g8t\") pod \"fafa3570-6c06-47a3-afda-79a1062452ed\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835488 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-run-httpd\") pod \"fafa3570-6c06-47a3-afda-79a1062452ed\" (UID: \"fafa3570-6c06-47a3-afda-79a1062452ed\") " Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835500 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fafa3570-6c06-47a3-afda-79a1062452ed" (UID: "fafa3570-6c06-47a3-afda-79a1062452ed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.835882 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.836095 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fafa3570-6c06-47a3-afda-79a1062452ed" (UID: "fafa3570-6c06-47a3-afda-79a1062452ed"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.863275 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-scripts" (OuterVolumeSpecName: "scripts") pod "fafa3570-6c06-47a3-afda-79a1062452ed" (UID: "fafa3570-6c06-47a3-afda-79a1062452ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.871035 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fafa3570-6c06-47a3-afda-79a1062452ed-kube-api-access-m5g8t" (OuterVolumeSpecName: "kube-api-access-m5g8t") pod "fafa3570-6c06-47a3-afda-79a1062452ed" (UID: "fafa3570-6c06-47a3-afda-79a1062452ed"). InnerVolumeSpecName "kube-api-access-m5g8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.886990 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fafa3570-6c06-47a3-afda-79a1062452ed" (UID: "fafa3570-6c06-47a3-afda-79a1062452ed"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.911867 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fafa3570-6c06-47a3-afda-79a1062452ed" (UID: "fafa3570-6c06-47a3-afda-79a1062452ed"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.937713 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.937754 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5g8t\" (UniqueName: \"kubernetes.io/projected/fafa3570-6c06-47a3-afda-79a1062452ed-kube-api-access-m5g8t\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.937768 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fafa3570-6c06-47a3-afda-79a1062452ed-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.937778 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.937788 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.939175 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fafa3570-6c06-47a3-afda-79a1062452ed" (UID: "fafa3570-6c06-47a3-afda-79a1062452ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:04 crc kubenswrapper[4834]: I0130 21:39:04.981739 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-config-data" (OuterVolumeSpecName: "config-data") pod "fafa3570-6c06-47a3-afda-79a1062452ed" (UID: "fafa3570-6c06-47a3-afda-79a1062452ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.039545 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.039593 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafa3570-6c06-47a3-afda-79a1062452ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.689342 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.712031 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.720602 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.754818 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:05 crc kubenswrapper[4834]: E0130 21:39:05.755341 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="proxy-httpd" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.755363 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="proxy-httpd" Jan 30 21:39:05 crc kubenswrapper[4834]: E0130 21:39:05.755388 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="ceilometer-central-agent" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.755416 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="ceilometer-central-agent" Jan 30 21:39:05 crc kubenswrapper[4834]: E0130 21:39:05.755436 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="ceilometer-notification-agent" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.755449 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="ceilometer-notification-agent" Jan 30 21:39:05 crc kubenswrapper[4834]: E0130 21:39:05.755489 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="sg-core" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.755497 4834 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="sg-core" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.755726 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="sg-core" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.755760 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="ceilometer-central-agent" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.755772 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="proxy-httpd" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.755796 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" containerName="ceilometer-notification-agent" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.762383 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.766779 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.767116 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.768109 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.769277 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.833186 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:05 crc kubenswrapper[4834]: E0130 21:39:05.834023 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-64v28 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="03a734a8-4eae-4d5b-b92e-737a4b8e6187" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.854208 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.854274 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-config-data\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " 
pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.854528 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.854565 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-scripts\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.854597 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64v28\" (UniqueName: \"kubernetes.io/projected/03a734a8-4eae-4d5b-b92e-737a4b8e6187-kube-api-access-64v28\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.854685 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.855021 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-log-httpd\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.855065 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-run-httpd\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957043 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957090 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-config-data\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957161 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957183 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-scripts\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957204 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64v28\" (UniqueName: \"kubernetes.io/projected/03a734a8-4eae-4d5b-b92e-737a4b8e6187-kube-api-access-64v28\") pod \"ceilometer-0\" (UID: 
\"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957235 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957283 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-log-httpd\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957298 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-run-httpd\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.957759 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-run-httpd\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.958012 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-log-httpd\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.961998 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-scripts\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.968749 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.970276 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-config-data\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.970307 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.974099 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:05 crc kubenswrapper[4834]: I0130 21:39:05.977130 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64v28\" (UniqueName: \"kubernetes.io/projected/03a734a8-4eae-4d5b-b92e-737a4b8e6187-kube-api-access-64v28\") pod \"ceilometer-0\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " pod="openstack/ceilometer-0" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 
21:39:06.702287 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.715354 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.800360 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-combined-ca-bundle\") pod \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.800432 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-scripts\") pod \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.800522 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-config-data\") pod \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.800601 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-run-httpd\") pod \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.800663 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-sg-core-conf-yaml\") pod \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\" (UID: 
\"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.800686 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-ceilometer-tls-certs\") pod \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.800799 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64v28\" (UniqueName: \"kubernetes.io/projected/03a734a8-4eae-4d5b-b92e-737a4b8e6187-kube-api-access-64v28\") pod \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.800845 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-log-httpd\") pod \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\" (UID: \"03a734a8-4eae-4d5b-b92e-737a4b8e6187\") " Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.801317 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "03a734a8-4eae-4d5b-b92e-737a4b8e6187" (UID: "03a734a8-4eae-4d5b-b92e-737a4b8e6187"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.801587 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "03a734a8-4eae-4d5b-b92e-737a4b8e6187" (UID: "03a734a8-4eae-4d5b-b92e-737a4b8e6187"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.805557 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "03a734a8-4eae-4d5b-b92e-737a4b8e6187" (UID: "03a734a8-4eae-4d5b-b92e-737a4b8e6187"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.806472 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-config-data" (OuterVolumeSpecName: "config-data") pod "03a734a8-4eae-4d5b-b92e-737a4b8e6187" (UID: "03a734a8-4eae-4d5b-b92e-737a4b8e6187"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.806826 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a734a8-4eae-4d5b-b92e-737a4b8e6187-kube-api-access-64v28" (OuterVolumeSpecName: "kube-api-access-64v28") pod "03a734a8-4eae-4d5b-b92e-737a4b8e6187" (UID: "03a734a8-4eae-4d5b-b92e-737a4b8e6187"). InnerVolumeSpecName "kube-api-access-64v28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.807486 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "03a734a8-4eae-4d5b-b92e-737a4b8e6187" (UID: "03a734a8-4eae-4d5b-b92e-737a4b8e6187"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.819180 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03a734a8-4eae-4d5b-b92e-737a4b8e6187" (UID: "03a734a8-4eae-4d5b-b92e-737a4b8e6187"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.820484 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-scripts" (OuterVolumeSpecName: "scripts") pod "03a734a8-4eae-4d5b-b92e-737a4b8e6187" (UID: "03a734a8-4eae-4d5b-b92e-737a4b8e6187"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.902922 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.902978 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.902990 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.902999 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:06 crc kubenswrapper[4834]: 
I0130 21:39:06.903010 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.903021 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/03a734a8-4eae-4d5b-b92e-737a4b8e6187-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.903030 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64v28\" (UniqueName: \"kubernetes.io/projected/03a734a8-4eae-4d5b-b92e-737a4b8e6187-kube-api-access-64v28\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:06 crc kubenswrapper[4834]: I0130 21:39:06.903042 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/03a734a8-4eae-4d5b-b92e-737a4b8e6187-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.543492 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fafa3570-6c06-47a3-afda-79a1062452ed" path="/var/lib/kubelet/pods/fafa3570-6c06-47a3-afda-79a1062452ed/volumes" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.710352 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.752036 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.780494 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.807145 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.810532 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.815840 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.815918 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.818968 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.824853 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.923100 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.923191 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-scripts\") pod \"ceilometer-0\" (UID: 
\"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.923215 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.923274 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-config-data\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.923310 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-run-httpd\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.923357 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: I0130 21:39:07.923376 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ztz\" (UniqueName: \"kubernetes.io/projected/9aecf756-d98a-4d9e-9e19-e1497ae773d7-kube-api-access-l2ztz\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:07 crc kubenswrapper[4834]: 
I0130 21:39:07.923419 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-log-httpd\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.025335 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.025661 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-scripts\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.025687 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.025757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-config-data\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.025800 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-run-httpd\") pod 
\"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.025858 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.025882 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ztz\" (UniqueName: \"kubernetes.io/projected/9aecf756-d98a-4d9e-9e19-e1497ae773d7-kube-api-access-l2ztz\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.025917 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-log-httpd\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.026380 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-log-httpd\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.027209 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-run-httpd\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.031089 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.032054 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-scripts\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.032356 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-config-data\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.041944 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.043259 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ztz\" (UniqueName: \"kubernetes.io/projected/9aecf756-d98a-4d9e-9e19-e1497ae773d7-kube-api-access-l2ztz\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.045563 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " pod="openstack/ceilometer-0" Jan 30 21:39:08 crc 
kubenswrapper[4834]: I0130 21:39:08.128853 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.640552 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:39:08 crc kubenswrapper[4834]: W0130 21:39:08.641492 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aecf756_d98a_4d9e_9e19_e1497ae773d7.slice/crio-21cdc1cd5334ab4385962bad71044764a8957bd1878858e8fe32a42e3925cb90 WatchSource:0}: Error finding container 21cdc1cd5334ab4385962bad71044764a8957bd1878858e8fe32a42e3925cb90: Status 404 returned error can't find the container with id 21cdc1cd5334ab4385962bad71044764a8957bd1878858e8fe32a42e3925cb90 Jan 30 21:39:08 crc kubenswrapper[4834]: I0130 21:39:08.721223 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerStarted","Data":"21cdc1cd5334ab4385962bad71044764a8957bd1878858e8fe32a42e3925cb90"} Jan 30 21:39:09 crc kubenswrapper[4834]: I0130 21:39:09.562953 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a734a8-4eae-4d5b-b92e-737a4b8e6187" path="/var/lib/kubelet/pods/03a734a8-4eae-4d5b-b92e-737a4b8e6187/volumes" Jan 30 21:39:09 crc kubenswrapper[4834]: I0130 21:39:09.743675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerStarted","Data":"7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1"} Jan 30 21:39:10 crc kubenswrapper[4834]: I0130 21:39:10.756212 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerStarted","Data":"6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5"} Jan 30 21:39:11 
crc kubenswrapper[4834]: I0130 21:39:11.782463 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerStarted","Data":"8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496"} Jan 30 21:39:14 crc kubenswrapper[4834]: I0130 21:39:14.831993 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerStarted","Data":"5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f"} Jan 30 21:39:14 crc kubenswrapper[4834]: I0130 21:39:14.832586 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:39:14 crc kubenswrapper[4834]: I0130 21:39:14.857504 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.891479925 podStartE2EDuration="7.857489279s" podCreationTimestamp="2026-01-30 21:39:07 +0000 UTC" firstStartedPulling="2026-01-30 21:39:08.644628545 +0000 UTC m=+1399.797774693" lastFinishedPulling="2026-01-30 21:39:13.610637909 +0000 UTC m=+1404.763784047" observedRunningTime="2026-01-30 21:39:14.855778001 +0000 UTC m=+1406.008924139" watchObservedRunningTime="2026-01-30 21:39:14.857489279 +0000 UTC m=+1406.010635417" Jan 30 21:39:21 crc kubenswrapper[4834]: I0130 21:39:21.903948 4834 generic.go:334] "Generic (PLEG): container finished" podID="665b0f39-b1d4-431f-8741-90c0b6d31d52" containerID="48a941c4cb471658f4609af2ce965991f76cb6cc26499a54ecacbe135e58b6ed" exitCode=0 Jan 30 21:39:21 crc kubenswrapper[4834]: I0130 21:39:21.904086 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7sj55" event={"ID":"665b0f39-b1d4-431f-8741-90c0b6d31d52","Type":"ContainerDied","Data":"48a941c4cb471658f4609af2ce965991f76cb6cc26499a54ecacbe135e58b6ed"} Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.276476 4834 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7sj55" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.369462 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-scripts\") pod \"665b0f39-b1d4-431f-8741-90c0b6d31d52\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.369925 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-config-data\") pod \"665b0f39-b1d4-431f-8741-90c0b6d31d52\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.369954 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-combined-ca-bundle\") pod \"665b0f39-b1d4-431f-8741-90c0b6d31d52\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.369982 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftdf7\" (UniqueName: \"kubernetes.io/projected/665b0f39-b1d4-431f-8741-90c0b6d31d52-kube-api-access-ftdf7\") pod \"665b0f39-b1d4-431f-8741-90c0b6d31d52\" (UID: \"665b0f39-b1d4-431f-8741-90c0b6d31d52\") " Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.377928 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-scripts" (OuterVolumeSpecName: "scripts") pod "665b0f39-b1d4-431f-8741-90c0b6d31d52" (UID: "665b0f39-b1d4-431f-8741-90c0b6d31d52"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.378698 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665b0f39-b1d4-431f-8741-90c0b6d31d52-kube-api-access-ftdf7" (OuterVolumeSpecName: "kube-api-access-ftdf7") pod "665b0f39-b1d4-431f-8741-90c0b6d31d52" (UID: "665b0f39-b1d4-431f-8741-90c0b6d31d52"). InnerVolumeSpecName "kube-api-access-ftdf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.415620 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-config-data" (OuterVolumeSpecName: "config-data") pod "665b0f39-b1d4-431f-8741-90c0b6d31d52" (UID: "665b0f39-b1d4-431f-8741-90c0b6d31d52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.415928 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "665b0f39-b1d4-431f-8741-90c0b6d31d52" (UID: "665b0f39-b1d4-431f-8741-90c0b6d31d52"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.471796 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.471836 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.471849 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftdf7\" (UniqueName: \"kubernetes.io/projected/665b0f39-b1d4-431f-8741-90c0b6d31d52-kube-api-access-ftdf7\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.471858 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/665b0f39-b1d4-431f-8741-90c0b6d31d52-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.925960 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7sj55" event={"ID":"665b0f39-b1d4-431f-8741-90c0b6d31d52","Type":"ContainerDied","Data":"48340eb894a004c8dd15966e0a606ed65e99917ab2dddfa87a1597846444a64a"} Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.926018 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48340eb894a004c8dd15966e0a606ed65e99917ab2dddfa87a1597846444a64a" Jan 30 21:39:23 crc kubenswrapper[4834]: I0130 21:39:23.926078 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7sj55" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.040900 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 21:39:24 crc kubenswrapper[4834]: E0130 21:39:24.041829 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="665b0f39-b1d4-431f-8741-90c0b6d31d52" containerName="nova-cell0-conductor-db-sync" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.041879 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="665b0f39-b1d4-431f-8741-90c0b6d31d52" containerName="nova-cell0-conductor-db-sync" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.042205 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="665b0f39-b1d4-431f-8741-90c0b6d31d52" containerName="nova-cell0-conductor-db-sync" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.045320 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.050791 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.050795 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-266c5" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.074733 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.082863 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58a830a-4400-44c9-be55-758c32d90ac4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 
21:39:24.083044 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58a830a-4400-44c9-be55-758c32d90ac4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.083113 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68v22\" (UniqueName: \"kubernetes.io/projected/f58a830a-4400-44c9-be55-758c32d90ac4-kube-api-access-68v22\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.185081 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58a830a-4400-44c9-be55-758c32d90ac4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.185147 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58a830a-4400-44c9-be55-758c32d90ac4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.185167 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68v22\" (UniqueName: \"kubernetes.io/projected/f58a830a-4400-44c9-be55-758c32d90ac4-kube-api-access-68v22\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.188776 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58a830a-4400-44c9-be55-758c32d90ac4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.188975 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58a830a-4400-44c9-be55-758c32d90ac4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.204552 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68v22\" (UniqueName: \"kubernetes.io/projected/f58a830a-4400-44c9-be55-758c32d90ac4-kube-api-access-68v22\") pod \"nova-cell0-conductor-0\" (UID: \"f58a830a-4400-44c9-be55-758c32d90ac4\") " pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.364129 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.872155 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 30 21:39:24 crc kubenswrapper[4834]: W0130 21:39:24.873245 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf58a830a_4400_44c9_be55_758c32d90ac4.slice/crio-f7412be2d4bb9ccc98f5faf2f49d354378ba5b15dd38910374022017a7f3a1de WatchSource:0}: Error finding container f7412be2d4bb9ccc98f5faf2f49d354378ba5b15dd38910374022017a7f3a1de: Status 404 returned error can't find the container with id f7412be2d4bb9ccc98f5faf2f49d354378ba5b15dd38910374022017a7f3a1de Jan 30 21:39:24 crc kubenswrapper[4834]: I0130 21:39:24.937967 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f58a830a-4400-44c9-be55-758c32d90ac4","Type":"ContainerStarted","Data":"f7412be2d4bb9ccc98f5faf2f49d354378ba5b15dd38910374022017a7f3a1de"} Jan 30 21:39:25 crc kubenswrapper[4834]: I0130 21:39:25.952168 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f58a830a-4400-44c9-be55-758c32d90ac4","Type":"ContainerStarted","Data":"a9356197cde836f7a189d54f61c2c3be45354949aefcd156930fe64aaa6e3099"} Jan 30 21:39:25 crc kubenswrapper[4834]: I0130 21:39:25.952768 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:25 crc kubenswrapper[4834]: I0130 21:39:25.987897 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.9878635660000001 podStartE2EDuration="1.987863566s" podCreationTimestamp="2026-01-30 21:39:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 
21:39:25.980423936 +0000 UTC m=+1417.133570074" watchObservedRunningTime="2026-01-30 21:39:25.987863566 +0000 UTC m=+1417.141009734" Jan 30 21:39:29 crc kubenswrapper[4834]: I0130 21:39:29.851844 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9vn2x"] Jan 30 21:39:29 crc kubenswrapper[4834]: I0130 21:39:29.857224 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:29 crc kubenswrapper[4834]: I0130 21:39:29.890058 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vn2x"] Jan 30 21:39:29 crc kubenswrapper[4834]: I0130 21:39:29.907102 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-545jl\" (UniqueName: \"kubernetes.io/projected/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-kube-api-access-545jl\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:29 crc kubenswrapper[4834]: I0130 21:39:29.907300 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-utilities\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:29 crc kubenswrapper[4834]: I0130 21:39:29.907388 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-catalog-content\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:30 crc kubenswrapper[4834]: I0130 21:39:30.009200 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-545jl\" (UniqueName: \"kubernetes.io/projected/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-kube-api-access-545jl\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:30 crc kubenswrapper[4834]: I0130 21:39:30.009340 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-utilities\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:30 crc kubenswrapper[4834]: I0130 21:39:30.009428 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-catalog-content\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:30 crc kubenswrapper[4834]: I0130 21:39:30.010055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-catalog-content\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:30 crc kubenswrapper[4834]: I0130 21:39:30.010159 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-utilities\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:30 crc kubenswrapper[4834]: I0130 21:39:30.031935 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-545jl\" (UniqueName: \"kubernetes.io/projected/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-kube-api-access-545jl\") pod \"certified-operators-9vn2x\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:30 crc kubenswrapper[4834]: I0130 21:39:30.180510 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:30 crc kubenswrapper[4834]: I0130 21:39:30.647198 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vn2x"] Jan 30 21:39:30 crc kubenswrapper[4834]: W0130 21:39:30.667521 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee11cf6_ede3_449f_8ca4_dbf77c6e6323.slice/crio-84210eca0fd0fc14946c50a645fa9c8ce6a9822519d616dfc21c434068f58700 WatchSource:0}: Error finding container 84210eca0fd0fc14946c50a645fa9c8ce6a9822519d616dfc21c434068f58700: Status 404 returned error can't find the container with id 84210eca0fd0fc14946c50a645fa9c8ce6a9822519d616dfc21c434068f58700 Jan 30 21:39:31 crc kubenswrapper[4834]: I0130 21:39:31.004438 4834 generic.go:334] "Generic (PLEG): container finished" podID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerID="388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab" exitCode=0 Jan 30 21:39:31 crc kubenswrapper[4834]: I0130 21:39:31.004510 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vn2x" event={"ID":"eee11cf6-ede3-449f-8ca4-dbf77c6e6323","Type":"ContainerDied","Data":"388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab"} Jan 30 21:39:31 crc kubenswrapper[4834]: I0130 21:39:31.004818 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vn2x" 
event={"ID":"eee11cf6-ede3-449f-8ca4-dbf77c6e6323","Type":"ContainerStarted","Data":"84210eca0fd0fc14946c50a645fa9c8ce6a9822519d616dfc21c434068f58700"} Jan 30 21:39:32 crc kubenswrapper[4834]: I0130 21:39:32.018284 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vn2x" event={"ID":"eee11cf6-ede3-449f-8ca4-dbf77c6e6323","Type":"ContainerStarted","Data":"dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647"} Jan 30 21:39:34 crc kubenswrapper[4834]: I0130 21:39:34.403967 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.057208 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vq77c"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.060185 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.066417 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.075096 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.078415 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vq77c"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.113290 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-scripts\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.113424 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-config-data\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.113548 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.113579 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdth9\" (UniqueName: \"kubernetes.io/projected/9628df07-52b2-4e8b-9b6b-a1964c59ea27-kube-api-access-wdth9\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.215310 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdth9\" (UniqueName: \"kubernetes.io/projected/9628df07-52b2-4e8b-9b6b-a1964c59ea27-kube-api-access-wdth9\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.215462 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-scripts\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.215515 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-config-data\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.215646 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.222977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-scripts\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.225720 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.241255 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-config-data\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.246165 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdth9\" 
(UniqueName: \"kubernetes.io/projected/9628df07-52b2-4e8b-9b6b-a1964c59ea27-kube-api-access-wdth9\") pod \"nova-cell0-cell-mapping-vq77c\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.338560 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.339894 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.344687 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.366872 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.386294 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.398923 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.406068 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.415207 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.422780 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-config-data\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.422819 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52w5h\" (UniqueName: \"kubernetes.io/projected/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-kube-api-access-52w5h\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.422863 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcngr\" (UniqueName: \"kubernetes.io/projected/5678883b-9274-4209-9214-ddfac32200a5-kube-api-access-hcngr\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.422892 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.422925 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.423022 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.425473 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.427614 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.432950 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.449487 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.492592 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.526351 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcngr\" (UniqueName: 
\"kubernetes.io/projected/5678883b-9274-4209-9214-ddfac32200a5-kube-api-access-hcngr\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528031 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/039162bb-a262-4029-94ba-5682dc5a1b47-logs\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528079 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528167 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528231 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-config-data\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528324 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq2bw\" (UniqueName: \"kubernetes.io/projected/039162bb-a262-4029-94ba-5682dc5a1b47-kube-api-access-hq2bw\") pod \"nova-api-0\" (UID: 
\"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528652 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528821 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-config-data\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.528840 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52w5h\" (UniqueName: \"kubernetes.io/projected/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-kube-api-access-52w5h\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.532930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.561222 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.563097 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-config-data\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.563382 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.567143 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcngr\" (UniqueName: \"kubernetes.io/projected/5678883b-9274-4209-9214-ddfac32200a5-kube-api-access-hcngr\") pod \"nova-scheduler-0\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.577522 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52w5h\" (UniqueName: \"kubernetes.io/projected/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-kube-api-access-52w5h\") pod \"nova-cell1-novncproxy-0\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.603309 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.604857 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.610005 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.626605 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.632271 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-config-data\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.632387 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.632427 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.632476 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9856\" (UniqueName: \"kubernetes.io/projected/023fb86d-a292-4a95-9411-a008f9e7e934-kube-api-access-t9856\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.632565 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/039162bb-a262-4029-94ba-5682dc5a1b47-logs\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.632627 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/023fb86d-a292-4a95-9411-a008f9e7e934-logs\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.632671 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-config-data\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.632731 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq2bw\" (UniqueName: \"kubernetes.io/projected/039162bb-a262-4029-94ba-5682dc5a1b47-kube-api-access-hq2bw\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.635148 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/039162bb-a262-4029-94ba-5682dc5a1b47-logs\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.649375 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-config-data\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc 
kubenswrapper[4834]: I0130 21:39:35.650822 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.656208 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq2bw\" (UniqueName: \"kubernetes.io/projected/039162bb-a262-4029-94ba-5682dc5a1b47-kube-api-access-hq2bw\") pod \"nova-api-0\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.657985 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-66tdb"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.660509 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.664640 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.704236 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-66tdb"] Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.734667 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-config\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.734776 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.734833 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9856\" (UniqueName: \"kubernetes.io/projected/023fb86d-a292-4a95-9411-a008f9e7e934-kube-api-access-t9856\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.734853 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swk76\" (UniqueName: \"kubernetes.io/projected/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-kube-api-access-swk76\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.734917 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.734937 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/023fb86d-a292-4a95-9411-a008f9e7e934-logs\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.734975 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-svc\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.734992 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.735114 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-config-data\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.735136 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-swift-storage-0\") 
pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.737835 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/023fb86d-a292-4a95-9411-a008f9e7e934-logs\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.747498 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.747934 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-config-data\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.761357 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9856\" (UniqueName: \"kubernetes.io/projected/023fb86d-a292-4a95-9411-a008f9e7e934-kube-api-access-t9856\") pod \"nova-metadata-0\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " pod="openstack/nova-metadata-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.837286 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.837326 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-config\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.837935 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swk76\" (UniqueName: \"kubernetes.io/projected/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-kube-api-access-swk76\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.837980 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.838007 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.838025 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-svc\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 
21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.838123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.838736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.839484 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-config\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.839491 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.839990 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.840163 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-svc\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.871240 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swk76\" (UniqueName: \"kubernetes.io/projected/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-kube-api-access-swk76\") pod \"dnsmasq-dns-865f5d856f-66tdb\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.903946 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:39:35 crc kubenswrapper[4834]: I0130 21:39:35.941761 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.002582 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.105353 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vq77c"] Jan 30 21:39:36 crc kubenswrapper[4834]: W0130 21:39:36.115757 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9628df07_52b2_4e8b_9b6b_a1964c59ea27.slice/crio-7f41a115d4eb5f7a02ca0d140868723c9549177ebfbc339f957c589dc9807a1a WatchSource:0}: Error finding container 7f41a115d4eb5f7a02ca0d140868723c9549177ebfbc339f957c589dc9807a1a: Status 404 returned error can't find the container with id 7f41a115d4eb5f7a02ca0d140868723c9549177ebfbc339f957c589dc9807a1a Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.203572 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkmbc"] Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.205513 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.208269 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.210569 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.214097 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkmbc"] Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.271193 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.274521 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4t57\" (UniqueName: \"kubernetes.io/projected/8e513111-c687-4c45-8262-7ce559c7decf-kube-api-access-j4t57\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.274750 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-scripts\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.275019 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-config-data\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " 
pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.275086 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.376661 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4t57\" (UniqueName: \"kubernetes.io/projected/8e513111-c687-4c45-8262-7ce559c7decf-kube-api-access-j4t57\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.376801 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-scripts\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.376895 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-config-data\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.376919 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: 
\"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.394268 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-scripts\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.403379 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-config-data\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.404310 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.423873 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4t57\" (UniqueName: \"kubernetes.io/projected/8e513111-c687-4c45-8262-7ce559c7decf-kube-api-access-j4t57\") pod \"nova-cell1-conductor-db-sync-mkmbc\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.499131 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.541196 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:36 crc kubenswrapper[4834]: 
I0130 21:39:36.650568 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.684511 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:39:36 crc kubenswrapper[4834]: I0130 21:39:36.755185 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-66tdb"] Jan 30 21:39:36 crc kubenswrapper[4834]: W0130 21:39:36.757069 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a9cd9a1_0f8e_4626_b69b_64872b7bc8d9.slice/crio-1a3b18332c7ef272bf8d8b9b07830e5644afcb32d1ef7d3fe5df7d0c4222cf64 WatchSource:0}: Error finding container 1a3b18332c7ef272bf8d8b9b07830e5644afcb32d1ef7d3fe5df7d0c4222cf64: Status 404 returned error can't find the container with id 1a3b18332c7ef272bf8d8b9b07830e5644afcb32d1ef7d3fe5df7d0c4222cf64 Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.124170 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5678883b-9274-4209-9214-ddfac32200a5","Type":"ContainerStarted","Data":"292aa66feff7af98f45ae0c4b2789b09d22f2b14861cbf373db09a389120437a"} Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.125184 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vq77c" event={"ID":"9628df07-52b2-4e8b-9b6b-a1964c59ea27","Type":"ContainerStarted","Data":"7f41a115d4eb5f7a02ca0d140868723c9549177ebfbc339f957c589dc9807a1a"} Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.127001 4834 generic.go:334] "Generic (PLEG): container finished" podID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerID="dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647" exitCode=0 Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.127034 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9vn2x" event={"ID":"eee11cf6-ede3-449f-8ca4-dbf77c6e6323","Type":"ContainerDied","Data":"dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647"} Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.128121 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"039162bb-a262-4029-94ba-5682dc5a1b47","Type":"ContainerStarted","Data":"740b8ec3e8ca209dfbab7ca793d03a4c37a1f290209690f1c43292d0856a2ed4"} Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.129308 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4","Type":"ContainerStarted","Data":"9466372a2b51d644eb075d4728225949fa9a03935fa926b94816372c2b1da0d1"} Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.131777 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" event={"ID":"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9","Type":"ContainerStarted","Data":"1a3b18332c7ef272bf8d8b9b07830e5644afcb32d1ef7d3fe5df7d0c4222cf64"} Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.132741 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"023fb86d-a292-4a95-9411-a008f9e7e934","Type":"ContainerStarted","Data":"9d593846b9b3e55355991d8e0488ae9467ac2959a1345caf3742adc36e70df64"} Jan 30 21:39:37 crc kubenswrapper[4834]: I0130 21:39:37.162510 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkmbc"] Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.161685 4834 generic.go:334] "Generic (PLEG): container finished" podID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" containerID="d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8" exitCode=0 Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.161900 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-865f5d856f-66tdb" event={"ID":"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9","Type":"ContainerDied","Data":"d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8"} Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.166743 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" event={"ID":"8e513111-c687-4c45-8262-7ce559c7decf","Type":"ContainerStarted","Data":"e41d8ddee34c397e7d949eb9a68e780c9a835c0c6f542f30abc6e93b93f3b631"} Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.166799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" event={"ID":"8e513111-c687-4c45-8262-7ce559c7decf","Type":"ContainerStarted","Data":"6888c3b018e772b38a8797c15c13770c6ac2d2ce53418128dca2ac8c42590b18"} Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.170918 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vq77c" event={"ID":"9628df07-52b2-4e8b-9b6b-a1964c59ea27","Type":"ContainerStarted","Data":"cdd3c241810cf3af99121cf9e569a6eeb1e998a40b2ce8c5e35d839e5362a59c"} Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.198934 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.236849 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vq77c" podStartSLOduration=3.236806699 podStartE2EDuration="3.236806699s" podCreationTimestamp="2026-01-30 21:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:38.209957021 +0000 UTC m=+1429.363103159" watchObservedRunningTime="2026-01-30 21:39:38.236806699 +0000 UTC m=+1429.389952837" Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.247794 4834 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" podStartSLOduration=2.247776679 podStartE2EDuration="2.247776679s" podCreationTimestamp="2026-01-30 21:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:38.231318214 +0000 UTC m=+1429.384464352" watchObservedRunningTime="2026-01-30 21:39:38.247776679 +0000 UTC m=+1429.400922817" Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.875651 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:39:38 crc kubenswrapper[4834]: I0130 21:39:38.907848 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:45 crc kubenswrapper[4834]: I0130 21:39:45.237941 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" event={"ID":"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9","Type":"ContainerStarted","Data":"dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36"} Jan 30 21:39:45 crc kubenswrapper[4834]: I0130 21:39:45.238513 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:45 crc kubenswrapper[4834]: I0130 21:39:45.277311 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" podStartSLOduration=10.277285783 podStartE2EDuration="10.277285783s" podCreationTimestamp="2026-01-30 21:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:45.262284319 +0000 UTC m=+1436.415430487" watchObservedRunningTime="2026-01-30 21:39:45.277285783 +0000 UTC m=+1436.430431931" Jan 30 21:39:46 crc kubenswrapper[4834]: I0130 21:39:46.251414 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4","Type":"ContainerStarted","Data":"21c7fda79770720423c94ff154c9288f7e61bc015197ccf840cdf8cd0dbfad10"} Jan 30 21:39:46 crc kubenswrapper[4834]: I0130 21:39:46.251576 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://21c7fda79770720423c94ff154c9288f7e61bc015197ccf840cdf8cd0dbfad10" gracePeriod=30 Jan 30 21:39:46 crc kubenswrapper[4834]: I0130 21:39:46.254541 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vn2x" event={"ID":"eee11cf6-ede3-449f-8ca4-dbf77c6e6323","Type":"ContainerStarted","Data":"554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816"} Jan 30 21:39:46 crc kubenswrapper[4834]: I0130 21:39:46.261000 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"039162bb-a262-4029-94ba-5682dc5a1b47","Type":"ContainerStarted","Data":"1f047ed2c95e826f67cc83dd744ea2be2e72347f4158ac63802c2d6208649f50"} Jan 30 21:39:46 crc kubenswrapper[4834]: I0130 21:39:46.289811 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.739046714 podStartE2EDuration="11.289789155s" podCreationTimestamp="2026-01-30 21:39:35 +0000 UTC" firstStartedPulling="2026-01-30 21:39:36.280029423 +0000 UTC m=+1427.433175561" lastFinishedPulling="2026-01-30 21:39:44.830771864 +0000 UTC m=+1435.983918002" observedRunningTime="2026-01-30 21:39:46.270438428 +0000 UTC m=+1437.423584606" watchObservedRunningTime="2026-01-30 21:39:46.289789155 +0000 UTC m=+1437.442935313" Jan 30 21:39:46 crc kubenswrapper[4834]: I0130 21:39:46.305801 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9vn2x" 
podStartSLOduration=6.417378843 podStartE2EDuration="17.305773566s" podCreationTimestamp="2026-01-30 21:39:29 +0000 UTC" firstStartedPulling="2026-01-30 21:39:31.005923069 +0000 UTC m=+1422.159069217" lastFinishedPulling="2026-01-30 21:39:41.894317802 +0000 UTC m=+1433.047463940" observedRunningTime="2026-01-30 21:39:46.290459694 +0000 UTC m=+1437.443605842" watchObservedRunningTime="2026-01-30 21:39:46.305773566 +0000 UTC m=+1437.458919724" Jan 30 21:39:47 crc kubenswrapper[4834]: I0130 21:39:47.273898 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"039162bb-a262-4029-94ba-5682dc5a1b47","Type":"ContainerStarted","Data":"c74453b4cf64bd5397414e89b4691ed7cad673fd5bb5fc6cabf781a4b968cdca"} Jan 30 21:39:47 crc kubenswrapper[4834]: I0130 21:39:47.306795 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.9823114520000003 podStartE2EDuration="12.306779903s" podCreationTimestamp="2026-01-30 21:39:35 +0000 UTC" firstStartedPulling="2026-01-30 21:39:36.507958619 +0000 UTC m=+1427.661104757" lastFinishedPulling="2026-01-30 21:39:44.83242707 +0000 UTC m=+1435.985573208" observedRunningTime="2026-01-30 21:39:47.303233883 +0000 UTC m=+1438.456380011" watchObservedRunningTime="2026-01-30 21:39:47.306779903 +0000 UTC m=+1438.459926031" Jan 30 21:39:48 crc kubenswrapper[4834]: I0130 21:39:48.287309 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"023fb86d-a292-4a95-9411-a008f9e7e934","Type":"ContainerStarted","Data":"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1"} Jan 30 21:39:48 crc kubenswrapper[4834]: I0130 21:39:48.289693 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5678883b-9274-4209-9214-ddfac32200a5","Type":"ContainerStarted","Data":"f7d7f7037ec2e3891672111a2b5e2eb0a3b7a527a6624a2b2446008fa234847e"} Jan 30 21:39:49 crc 
kubenswrapper[4834]: I0130 21:39:49.302352 4834 generic.go:334] "Generic (PLEG): container finished" podID="9628df07-52b2-4e8b-9b6b-a1964c59ea27" containerID="cdd3c241810cf3af99121cf9e569a6eeb1e998a40b2ce8c5e35d839e5362a59c" exitCode=0 Jan 30 21:39:49 crc kubenswrapper[4834]: I0130 21:39:49.302423 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vq77c" event={"ID":"9628df07-52b2-4e8b-9b6b-a1964c59ea27","Type":"ContainerDied","Data":"cdd3c241810cf3af99121cf9e569a6eeb1e998a40b2ce8c5e35d839e5362a59c"} Jan 30 21:39:49 crc kubenswrapper[4834]: I0130 21:39:49.305743 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" containerName="nova-metadata-log" containerID="cri-o://813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1" gracePeriod=30 Jan 30 21:39:49 crc kubenswrapper[4834]: I0130 21:39:49.305942 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" containerName="nova-metadata-metadata" containerID="cri-o://d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a" gracePeriod=30 Jan 30 21:39:49 crc kubenswrapper[4834]: I0130 21:39:49.306292 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"023fb86d-a292-4a95-9411-a008f9e7e934","Type":"ContainerStarted","Data":"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a"} Jan 30 21:39:49 crc kubenswrapper[4834]: I0130 21:39:49.361574 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.120345069 podStartE2EDuration="14.361555487s" podCreationTimestamp="2026-01-30 21:39:35 +0000 UTC" firstStartedPulling="2026-01-30 21:39:36.543652887 +0000 UTC m=+1427.696799025" lastFinishedPulling="2026-01-30 21:39:46.784863305 +0000 UTC 
m=+1437.938009443" observedRunningTime="2026-01-30 21:39:49.356786803 +0000 UTC m=+1440.509932941" watchObservedRunningTime="2026-01-30 21:39:49.361555487 +0000 UTC m=+1440.514701625" Jan 30 21:39:49 crc kubenswrapper[4834]: I0130 21:39:49.383656 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.267431643 podStartE2EDuration="14.383635341s" podCreationTimestamp="2026-01-30 21:39:35 +0000 UTC" firstStartedPulling="2026-01-30 21:39:36.670846909 +0000 UTC m=+1427.823993047" lastFinishedPulling="2026-01-30 21:39:46.787050607 +0000 UTC m=+1437.940196745" observedRunningTime="2026-01-30 21:39:49.371992762 +0000 UTC m=+1440.525138900" watchObservedRunningTime="2026-01-30 21:39:49.383635341 +0000 UTC m=+1440.536781479" Jan 30 21:39:49 crc kubenswrapper[4834]: I0130 21:39:49.986122 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.147176 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9856\" (UniqueName: \"kubernetes.io/projected/023fb86d-a292-4a95-9411-a008f9e7e934-kube-api-access-t9856\") pod \"023fb86d-a292-4a95-9411-a008f9e7e934\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.147344 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-config-data\") pod \"023fb86d-a292-4a95-9411-a008f9e7e934\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.147463 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-combined-ca-bundle\") pod \"023fb86d-a292-4a95-9411-a008f9e7e934\" 
(UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.147819 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/023fb86d-a292-4a95-9411-a008f9e7e934-logs\") pod \"023fb86d-a292-4a95-9411-a008f9e7e934\" (UID: \"023fb86d-a292-4a95-9411-a008f9e7e934\") " Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.148036 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/023fb86d-a292-4a95-9411-a008f9e7e934-logs" (OuterVolumeSpecName: "logs") pod "023fb86d-a292-4a95-9411-a008f9e7e934" (UID: "023fb86d-a292-4a95-9411-a008f9e7e934"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.148786 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/023fb86d-a292-4a95-9411-a008f9e7e934-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.154333 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023fb86d-a292-4a95-9411-a008f9e7e934-kube-api-access-t9856" (OuterVolumeSpecName: "kube-api-access-t9856") pod "023fb86d-a292-4a95-9411-a008f9e7e934" (UID: "023fb86d-a292-4a95-9411-a008f9e7e934"). InnerVolumeSpecName "kube-api-access-t9856". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.181080 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.182671 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.186523 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "023fb86d-a292-4a95-9411-a008f9e7e934" (UID: "023fb86d-a292-4a95-9411-a008f9e7e934"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.187301 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-config-data" (OuterVolumeSpecName: "config-data") pod "023fb86d-a292-4a95-9411-a008f9e7e934" (UID: "023fb86d-a292-4a95-9411-a008f9e7e934"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.250550 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9856\" (UniqueName: \"kubernetes.io/projected/023fb86d-a292-4a95-9411-a008f9e7e934-kube-api-access-t9856\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.250585 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.250594 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/023fb86d-a292-4a95-9411-a008f9e7e934-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.320770 4834 generic.go:334] "Generic (PLEG): container finished" podID="023fb86d-a292-4a95-9411-a008f9e7e934" containerID="d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a" exitCode=0 Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.320806 4834 generic.go:334] "Generic (PLEG): container finished" podID="023fb86d-a292-4a95-9411-a008f9e7e934" containerID="813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1" exitCode=143 Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.321034 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.322092 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"023fb86d-a292-4a95-9411-a008f9e7e934","Type":"ContainerDied","Data":"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a"} Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.322142 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"023fb86d-a292-4a95-9411-a008f9e7e934","Type":"ContainerDied","Data":"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1"} Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.322158 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"023fb86d-a292-4a95-9411-a008f9e7e934","Type":"ContainerDied","Data":"9d593846b9b3e55355991d8e0488ae9467ac2959a1345caf3742adc36e70df64"} Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.322179 4834 scope.go:117] "RemoveContainer" containerID="d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.369230 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.373195 4834 scope.go:117] "RemoveContainer" containerID="813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.388965 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.401689 4834 scope.go:117] "RemoveContainer" containerID="d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.402475 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:50 crc kubenswrapper[4834]: E0130 
21:39:50.402982 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" containerName="nova-metadata-metadata" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.402997 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" containerName="nova-metadata-metadata" Jan 30 21:39:50 crc kubenswrapper[4834]: E0130 21:39:50.403008 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" containerName="nova-metadata-log" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.403018 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" containerName="nova-metadata-log" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.404102 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" containerName="nova-metadata-log" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.404123 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" containerName="nova-metadata-metadata" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.405738 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: E0130 21:39:50.410515 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a\": container with ID starting with d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a not found: ID does not exist" containerID="d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.410560 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a"} err="failed to get container status \"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a\": rpc error: code = NotFound desc = could not find container \"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a\": container with ID starting with d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a not found: ID does not exist" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.410587 4834 scope.go:117] "RemoveContainer" containerID="813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1" Jan 30 21:39:50 crc kubenswrapper[4834]: E0130 21:39:50.410970 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1\": container with ID starting with 813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1 not found: ID does not exist" containerID="813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.410993 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1"} 
err="failed to get container status \"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1\": rpc error: code = NotFound desc = could not find container \"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1\": container with ID starting with 813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1 not found: ID does not exist" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.411006 4834 scope.go:117] "RemoveContainer" containerID="d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.411251 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a"} err="failed to get container status \"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a\": rpc error: code = NotFound desc = could not find container \"d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a\": container with ID starting with d9cf1c3ee5dbeed786d3b67450c4a6f13fbbc6460c5adcf7fbd8329bc604642a not found: ID does not exist" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.411291 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.411295 4834 scope.go:117] "RemoveContainer" containerID="813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.411446 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.417989 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.459796 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1"} err="failed to get container status \"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1\": rpc error: code = NotFound desc = could not find container \"813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1\": container with ID starting with 813093f8511dacdfcfdc3b0eebdfee8fd7aaca71323d948f10804cdf5610b7a1 not found: ID does not exist" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.561069 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7342471-1e91-4e79-863b-b7a8bb267083-logs\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.561117 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-config-data\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.561179 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.561213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 
crc kubenswrapper[4834]: I0130 21:39:50.561237 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8vrz\" (UniqueName: \"kubernetes.io/projected/e7342471-1e91-4e79-863b-b7a8bb267083-kube-api-access-r8vrz\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.665725 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.665728 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.665922 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.665983 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8vrz\" (UniqueName: \"kubernetes.io/projected/e7342471-1e91-4e79-863b-b7a8bb267083-kube-api-access-r8vrz\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.666318 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7342471-1e91-4e79-863b-b7a8bb267083-logs\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " 
pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.666411 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-config-data\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.667292 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7342471-1e91-4e79-863b-b7a8bb267083-logs\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.669877 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.670239 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-config-data\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.671342 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.685761 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8vrz\" (UniqueName: 
\"kubernetes.io/projected/e7342471-1e91-4e79-863b-b7a8bb267083-kube-api-access-r8vrz\") pod \"nova-metadata-0\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.780443 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.780660 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.838189 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.869536 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-combined-ca-bundle\") pod \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.870348 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-config-data\") pod \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.870452 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-scripts\") pod \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.870487 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdth9\" (UniqueName: 
\"kubernetes.io/projected/9628df07-52b2-4e8b-9b6b-a1964c59ea27-kube-api-access-wdth9\") pod \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\" (UID: \"9628df07-52b2-4e8b-9b6b-a1964c59ea27\") " Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.879305 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-scripts" (OuterVolumeSpecName: "scripts") pod "9628df07-52b2-4e8b-9b6b-a1964c59ea27" (UID: "9628df07-52b2-4e8b-9b6b-a1964c59ea27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.879368 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9628df07-52b2-4e8b-9b6b-a1964c59ea27-kube-api-access-wdth9" (OuterVolumeSpecName: "kube-api-access-wdth9") pod "9628df07-52b2-4e8b-9b6b-a1964c59ea27" (UID: "9628df07-52b2-4e8b-9b6b-a1964c59ea27"). InnerVolumeSpecName "kube-api-access-wdth9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.912962 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-config-data" (OuterVolumeSpecName: "config-data") pod "9628df07-52b2-4e8b-9b6b-a1964c59ea27" (UID: "9628df07-52b2-4e8b-9b6b-a1964c59ea27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.913009 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9628df07-52b2-4e8b-9b6b-a1964c59ea27" (UID: "9628df07-52b2-4e8b-9b6b-a1964c59ea27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.973461 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.973491 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.973500 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9628df07-52b2-4e8b-9b6b-a1964c59ea27-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:50 crc kubenswrapper[4834]: I0130 21:39:50.973508 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdth9\" (UniqueName: \"kubernetes.io/projected/9628df07-52b2-4e8b-9b6b-a1964c59ea27-kube-api-access-wdth9\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.004518 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.081158 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7p2sv"] Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.084926 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" podUID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerName="dnsmasq-dns" containerID="cri-o://a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0" gracePeriod=10 Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.236685 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9vn2x" 
podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="registry-server" probeResult="failure" output=< Jan 30 21:39:51 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:39:51 crc kubenswrapper[4834]: > Jan 30 21:39:51 crc kubenswrapper[4834]: W0130 21:39:51.323703 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7342471_1e91_4e79_863b_b7a8bb267083.slice/crio-72b35d4bea6f9a77d88b0c002823f14ae297a55b89cdedd3c66c80916f3101c5 WatchSource:0}: Error finding container 72b35d4bea6f9a77d88b0c002823f14ae297a55b89cdedd3c66c80916f3101c5: Status 404 returned error can't find the container with id 72b35d4bea6f9a77d88b0c002823f14ae297a55b89cdedd3c66c80916f3101c5 Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.325165 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.354201 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vq77c" event={"ID":"9628df07-52b2-4e8b-9b6b-a1964c59ea27","Type":"ContainerDied","Data":"7f41a115d4eb5f7a02ca0d140868723c9549177ebfbc339f957c589dc9807a1a"} Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.354240 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f41a115d4eb5f7a02ca0d140868723c9549177ebfbc339f957c589dc9807a1a" Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.354280 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vq77c" Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.543029 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="023fb86d-a292-4a95-9411-a008f9e7e934" path="/var/lib/kubelet/pods/023fb86d-a292-4a95-9411-a008f9e7e934/volumes" Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.633100 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.633348 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5678883b-9274-4209-9214-ddfac32200a5" containerName="nova-scheduler-scheduler" containerID="cri-o://f7d7f7037ec2e3891672111a2b5e2eb0a3b7a527a6624a2b2446008fa234847e" gracePeriod=30 Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.647642 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.647892 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" containerName="nova-api-log" containerID="cri-o://1f047ed2c95e826f67cc83dd744ea2be2e72347f4158ac63802c2d6208649f50" gracePeriod=30 Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.648041 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" containerName="nova-api-api" containerID="cri-o://c74453b4cf64bd5397414e89b4691ed7cad673fd5bb5fc6cabf781a4b968cdca" gracePeriod=30 Jan 30 21:39:51 crc kubenswrapper[4834]: I0130 21:39:51.657581 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.247718 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.364355 4834 generic.go:334] "Generic (PLEG): container finished" podID="5678883b-9274-4209-9214-ddfac32200a5" containerID="f7d7f7037ec2e3891672111a2b5e2eb0a3b7a527a6624a2b2446008fa234847e" exitCode=0 Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.364532 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5678883b-9274-4209-9214-ddfac32200a5","Type":"ContainerDied","Data":"f7d7f7037ec2e3891672111a2b5e2eb0a3b7a527a6624a2b2446008fa234847e"} Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.367803 4834 generic.go:334] "Generic (PLEG): container finished" podID="039162bb-a262-4029-94ba-5682dc5a1b47" containerID="c74453b4cf64bd5397414e89b4691ed7cad673fd5bb5fc6cabf781a4b968cdca" exitCode=0 Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.367836 4834 generic.go:334] "Generic (PLEG): container finished" podID="039162bb-a262-4029-94ba-5682dc5a1b47" containerID="1f047ed2c95e826f67cc83dd744ea2be2e72347f4158ac63802c2d6208649f50" exitCode=143 Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.367876 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"039162bb-a262-4029-94ba-5682dc5a1b47","Type":"ContainerDied","Data":"c74453b4cf64bd5397414e89b4691ed7cad673fd5bb5fc6cabf781a4b968cdca"} Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.367914 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"039162bb-a262-4029-94ba-5682dc5a1b47","Type":"ContainerDied","Data":"1f047ed2c95e826f67cc83dd744ea2be2e72347f4158ac63802c2d6208649f50"} Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.369898 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e7342471-1e91-4e79-863b-b7a8bb267083","Type":"ContainerStarted","Data":"385e15e77d0e31c14436baf593385a763b31b2e8e1be01c8af16cf387a2b6ddc"} Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.369925 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7342471-1e91-4e79-863b-b7a8bb267083","Type":"ContainerStarted","Data":"f7f732fa0573234450d9eea22282c6582f3e2be7f2958288accf79bf9645b4d6"} Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.369935 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7342471-1e91-4e79-863b-b7a8bb267083","Type":"ContainerStarted","Data":"72b35d4bea6f9a77d88b0c002823f14ae297a55b89cdedd3c66c80916f3101c5"} Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.371624 4834 generic.go:334] "Generic (PLEG): container finished" podID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerID="a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0" exitCode=0 Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.371658 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" event={"ID":"f94b59b7-cd95-45e4-ac51-213483a8cd62","Type":"ContainerDied","Data":"a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0"} Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.371690 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" event={"ID":"f94b59b7-cd95-45e4-ac51-213483a8cd62","Type":"ContainerDied","Data":"11dabcceb331518922d924ac4dfc049e0f2376559c758edbae3e23b54cba2851"} Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.371713 4834 scope.go:117] "RemoveContainer" containerID="a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.371898 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.401593 4834 scope.go:117] "RemoveContainer" containerID="75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.407869 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-sb\") pod \"f94b59b7-cd95-45e4-ac51-213483a8cd62\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.407935 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-config\") pod \"f94b59b7-cd95-45e4-ac51-213483a8cd62\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.407954 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-svc\") pod \"f94b59b7-cd95-45e4-ac51-213483a8cd62\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.408000 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-nb\") pod \"f94b59b7-cd95-45e4-ac51-213483a8cd62\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.408036 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-swift-storage-0\") pod \"f94b59b7-cd95-45e4-ac51-213483a8cd62\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " 
Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.408063 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggct2\" (UniqueName: \"kubernetes.io/projected/f94b59b7-cd95-45e4-ac51-213483a8cd62-kube-api-access-ggct2\") pod \"f94b59b7-cd95-45e4-ac51-213483a8cd62\" (UID: \"f94b59b7-cd95-45e4-ac51-213483a8cd62\") " Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.413836 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94b59b7-cd95-45e4-ac51-213483a8cd62-kube-api-access-ggct2" (OuterVolumeSpecName: "kube-api-access-ggct2") pod "f94b59b7-cd95-45e4-ac51-213483a8cd62" (UID: "f94b59b7-cd95-45e4-ac51-213483a8cd62"). InnerVolumeSpecName "kube-api-access-ggct2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.444078 4834 scope.go:117] "RemoveContainer" containerID="a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0" Jan 30 21:39:52 crc kubenswrapper[4834]: E0130 21:39:52.444593 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0\": container with ID starting with a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0 not found: ID does not exist" containerID="a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.444646 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0"} err="failed to get container status \"a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0\": rpc error: code = NotFound desc = could not find container \"a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0\": container with ID starting with 
a2943aeb4b8c6f6c9a9ae7603a3c8a5d568381aba75a8cc0b5ab2fe7dd4caaf0 not found: ID does not exist" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.444675 4834 scope.go:117] "RemoveContainer" containerID="75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed" Jan 30 21:39:52 crc kubenswrapper[4834]: E0130 21:39:52.445129 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed\": container with ID starting with 75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed not found: ID does not exist" containerID="75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.445174 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed"} err="failed to get container status \"75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed\": rpc error: code = NotFound desc = could not find container \"75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed\": container with ID starting with 75d3db8b767b4ff8d8ce87fb540f4fab57c206720fff11a608fa00ea00ce84ed not found: ID does not exist" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.505664 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f94b59b7-cd95-45e4-ac51-213483a8cd62" (UID: "f94b59b7-cd95-45e4-ac51-213483a8cd62"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.505668 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f94b59b7-cd95-45e4-ac51-213483a8cd62" (UID: "f94b59b7-cd95-45e4-ac51-213483a8cd62"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.507041 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f94b59b7-cd95-45e4-ac51-213483a8cd62" (UID: "f94b59b7-cd95-45e4-ac51-213483a8cd62"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.515201 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggct2\" (UniqueName: \"kubernetes.io/projected/f94b59b7-cd95-45e4-ac51-213483a8cd62-kube-api-access-ggct2\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.515236 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.515282 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.515299 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 
21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.538817 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-config" (OuterVolumeSpecName: "config") pod "f94b59b7-cd95-45e4-ac51-213483a8cd62" (UID: "f94b59b7-cd95-45e4-ac51-213483a8cd62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.544919 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f94b59b7-cd95-45e4-ac51-213483a8cd62" (UID: "f94b59b7-cd95-45e4-ac51-213483a8cd62"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.617784 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.617858 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f94b59b7-cd95-45e4-ac51-213483a8cd62-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.710815 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7p2sv"] Jan 30 21:39:52 crc kubenswrapper[4834]: I0130 21:39:52.720770 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-7p2sv"] Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.019524 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.030277 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.128220 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-config-data\") pod \"5678883b-9274-4209-9214-ddfac32200a5\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.128301 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-combined-ca-bundle\") pod \"5678883b-9274-4209-9214-ddfac32200a5\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.128467 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcngr\" (UniqueName: \"kubernetes.io/projected/5678883b-9274-4209-9214-ddfac32200a5-kube-api-access-hcngr\") pod \"5678883b-9274-4209-9214-ddfac32200a5\" (UID: \"5678883b-9274-4209-9214-ddfac32200a5\") " Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.134352 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5678883b-9274-4209-9214-ddfac32200a5-kube-api-access-hcngr" (OuterVolumeSpecName: "kube-api-access-hcngr") pod "5678883b-9274-4209-9214-ddfac32200a5" (UID: "5678883b-9274-4209-9214-ddfac32200a5"). InnerVolumeSpecName "kube-api-access-hcngr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.156967 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5678883b-9274-4209-9214-ddfac32200a5" (UID: "5678883b-9274-4209-9214-ddfac32200a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.162553 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-config-data" (OuterVolumeSpecName: "config-data") pod "5678883b-9274-4209-9214-ddfac32200a5" (UID: "5678883b-9274-4209-9214-ddfac32200a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.230606 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq2bw\" (UniqueName: \"kubernetes.io/projected/039162bb-a262-4029-94ba-5682dc5a1b47-kube-api-access-hq2bw\") pod \"039162bb-a262-4029-94ba-5682dc5a1b47\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.230917 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/039162bb-a262-4029-94ba-5682dc5a1b47-logs\") pod \"039162bb-a262-4029-94ba-5682dc5a1b47\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.231128 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-combined-ca-bundle\") pod \"039162bb-a262-4029-94ba-5682dc5a1b47\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " Jan 30 
21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.231248 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/039162bb-a262-4029-94ba-5682dc5a1b47-logs" (OuterVolumeSpecName: "logs") pod "039162bb-a262-4029-94ba-5682dc5a1b47" (UID: "039162bb-a262-4029-94ba-5682dc5a1b47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.231448 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-config-data\") pod \"039162bb-a262-4029-94ba-5682dc5a1b47\" (UID: \"039162bb-a262-4029-94ba-5682dc5a1b47\") " Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.232259 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.232361 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5678883b-9274-4209-9214-ddfac32200a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.232487 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcngr\" (UniqueName: \"kubernetes.io/projected/5678883b-9274-4209-9214-ddfac32200a5-kube-api-access-hcngr\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.232575 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/039162bb-a262-4029-94ba-5682dc5a1b47-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.233675 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/039162bb-a262-4029-94ba-5682dc5a1b47-kube-api-access-hq2bw" (OuterVolumeSpecName: "kube-api-access-hq2bw") pod "039162bb-a262-4029-94ba-5682dc5a1b47" (UID: "039162bb-a262-4029-94ba-5682dc5a1b47"). InnerVolumeSpecName "kube-api-access-hq2bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.259420 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-config-data" (OuterVolumeSpecName: "config-data") pod "039162bb-a262-4029-94ba-5682dc5a1b47" (UID: "039162bb-a262-4029-94ba-5682dc5a1b47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.282155 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "039162bb-a262-4029-94ba-5682dc5a1b47" (UID: "039162bb-a262-4029-94ba-5682dc5a1b47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.334521 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq2bw\" (UniqueName: \"kubernetes.io/projected/039162bb-a262-4029-94ba-5682dc5a1b47-kube-api-access-hq2bw\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.334555 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.334568 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039162bb-a262-4029-94ba-5682dc5a1b47-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.387618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"039162bb-a262-4029-94ba-5682dc5a1b47","Type":"ContainerDied","Data":"740b8ec3e8ca209dfbab7ca793d03a4c37a1f290209690f1c43292d0856a2ed4"} Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.388067 4834 scope.go:117] "RemoveContainer" containerID="c74453b4cf64bd5397414e89b4691ed7cad673fd5bb5fc6cabf781a4b968cdca" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.394619 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.396653 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" containerName="nova-metadata-log" containerID="cri-o://f7f732fa0573234450d9eea22282c6582f3e2be7f2958288accf79bf9645b4d6" gracePeriod=30 Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.397221 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.399379 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5678883b-9274-4209-9214-ddfac32200a5","Type":"ContainerDied","Data":"292aa66feff7af98f45ae0c4b2789b09d22f2b14861cbf373db09a389120437a"} Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.399469 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" containerName="nova-metadata-metadata" containerID="cri-o://385e15e77d0e31c14436baf593385a763b31b2e8e1be01c8af16cf387a2b6ddc" gracePeriod=30 Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.451766 4834 scope.go:117] "RemoveContainer" containerID="1f047ed2c95e826f67cc83dd744ea2be2e72347f4158ac63802c2d6208649f50" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.453367 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.453343314 podStartE2EDuration="3.453343314s" podCreationTimestamp="2026-01-30 21:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:53.431377374 +0000 UTC m=+1444.584523512" watchObservedRunningTime="2026-01-30 21:39:53.453343314 +0000 UTC m=+1444.606489452" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.465770 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.475021 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.488371 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.498734 4834 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.509760 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:53 crc kubenswrapper[4834]: E0130 21:39:53.510239 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9628df07-52b2-4e8b-9b6b-a1964c59ea27" containerName="nova-manage" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510258 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9628df07-52b2-4e8b-9b6b-a1964c59ea27" containerName="nova-manage" Jan 30 21:39:53 crc kubenswrapper[4834]: E0130 21:39:53.510274 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" containerName="nova-api-api" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510282 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" containerName="nova-api-api" Jan 30 21:39:53 crc kubenswrapper[4834]: E0130 21:39:53.510307 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" containerName="nova-api-log" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510313 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" containerName="nova-api-log" Jan 30 21:39:53 crc kubenswrapper[4834]: E0130 21:39:53.510324 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5678883b-9274-4209-9214-ddfac32200a5" containerName="nova-scheduler-scheduler" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510331 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5678883b-9274-4209-9214-ddfac32200a5" containerName="nova-scheduler-scheduler" Jan 30 21:39:53 crc kubenswrapper[4834]: E0130 21:39:53.510340 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerName="init" Jan 30 21:39:53 crc 
kubenswrapper[4834]: I0130 21:39:53.510345 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerName="init" Jan 30 21:39:53 crc kubenswrapper[4834]: E0130 21:39:53.510358 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerName="dnsmasq-dns" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510364 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerName="dnsmasq-dns" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510554 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" containerName="nova-api-log" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510574 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" containerName="nova-api-api" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510584 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerName="dnsmasq-dns" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510597 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5678883b-9274-4209-9214-ddfac32200a5" containerName="nova-scheduler-scheduler" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.510629 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9628df07-52b2-4e8b-9b6b-a1964c59ea27" containerName="nova-manage" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.511765 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.514960 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.518019 4834 scope.go:117] "RemoveContainer" containerID="f7d7f7037ec2e3891672111a2b5e2eb0a3b7a527a6624a2b2446008fa234847e" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.518122 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.519371 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.523032 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.555508 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="039162bb-a262-4029-94ba-5682dc5a1b47" path="/var/lib/kubelet/pods/039162bb-a262-4029-94ba-5682dc5a1b47/volumes" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.556226 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5678883b-9274-4209-9214-ddfac32200a5" path="/var/lib/kubelet/pods/5678883b-9274-4209-9214-ddfac32200a5/volumes" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.557363 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94b59b7-cd95-45e4-ac51-213483a8cd62" path="/var/lib/kubelet/pods/f94b59b7-cd95-45e4-ac51-213483a8cd62/volumes" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.574353 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.587712 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.644955 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.645098 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0df633-b4a0-4147-b2ef-46982c4f59c7-logs\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.645118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxvmv\" (UniqueName: \"kubernetes.io/projected/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-kube-api-access-rxvmv\") pod \"nova-scheduler-0\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.645154 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m2vh\" (UniqueName: \"kubernetes.io/projected/6a0df633-b4a0-4147-b2ef-46982c4f59c7-kube-api-access-7m2vh\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.645191 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-config-data\") pod \"nova-scheduler-0\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.645233 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.645278 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-config-data\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: E0130 21:39:53.714550 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7342471_1e91_4e79_863b_b7a8bb267083.slice/crio-conmon-f7f732fa0573234450d9eea22282c6582f3e2be7f2958288accf79bf9645b4d6.scope\": RecentStats: unable to find data in memory cache]" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.746874 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.746943 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0df633-b4a0-4147-b2ef-46982c4f59c7-logs\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.746965 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxvmv\" (UniqueName: \"kubernetes.io/projected/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-kube-api-access-rxvmv\") pod \"nova-scheduler-0\" 
(UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.746991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m2vh\" (UniqueName: \"kubernetes.io/projected/6a0df633-b4a0-4147-b2ef-46982c4f59c7-kube-api-access-7m2vh\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.747020 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-config-data\") pod \"nova-scheduler-0\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.747053 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.747084 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-config-data\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.747489 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0df633-b4a0-4147-b2ef-46982c4f59c7-logs\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.751237 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-config-data\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.751287 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.752128 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.754810 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-config-data\") pod \"nova-scheduler-0\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.764907 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxvmv\" (UniqueName: \"kubernetes.io/projected/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-kube-api-access-rxvmv\") pod \"nova-scheduler-0\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " pod="openstack/nova-scheduler-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.765139 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m2vh\" (UniqueName: \"kubernetes.io/projected/6a0df633-b4a0-4147-b2ef-46982c4f59c7-kube-api-access-7m2vh\") pod \"nova-api-0\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " pod="openstack/nova-api-0" Jan 30 21:39:53 crc 
kubenswrapper[4834]: I0130 21:39:53.887856 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:39:53 crc kubenswrapper[4834]: I0130 21:39:53.894315 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.416701 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7342471-1e91-4e79-863b-b7a8bb267083" containerID="385e15e77d0e31c14436baf593385a763b31b2e8e1be01c8af16cf387a2b6ddc" exitCode=0 Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.417422 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7342471-1e91-4e79-863b-b7a8bb267083" containerID="f7f732fa0573234450d9eea22282c6582f3e2be7f2958288accf79bf9645b4d6" exitCode=143 Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.416804 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7342471-1e91-4e79-863b-b7a8bb267083","Type":"ContainerDied","Data":"385e15e77d0e31c14436baf593385a763b31b2e8e1be01c8af16cf387a2b6ddc"} Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.417474 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e7342471-1e91-4e79-863b-b7a8bb267083","Type":"ContainerDied","Data":"f7f732fa0573234450d9eea22282c6582f3e2be7f2958288accf79bf9645b4d6"} Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.543205 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:39:54 crc kubenswrapper[4834]: W0130 21:39:54.548228 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05abe6f8_7cb5_4f41_aa26_cb06d63e3c79.slice/crio-0ed12a1f5bfe81c6b31caaa69ec05400b0508b480023ef69ccf551348457fffa WatchSource:0}: Error finding container 0ed12a1f5bfe81c6b31caaa69ec05400b0508b480023ef69ccf551348457fffa: 
Status 404 returned error can't find the container with id 0ed12a1f5bfe81c6b31caaa69ec05400b0508b480023ef69ccf551348457fffa Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.632510 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:39:54 crc kubenswrapper[4834]: W0130 21:39:54.655189 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a0df633_b4a0_4147_b2ef_46982c4f59c7.slice/crio-5111edbe5a6fb4d612b02074787eb94c3c8be5aad85336271640032dc4b69f78 WatchSource:0}: Error finding container 5111edbe5a6fb4d612b02074787eb94c3c8be5aad85336271640032dc4b69f78: Status 404 returned error can't find the container with id 5111edbe5a6fb4d612b02074787eb94c3c8be5aad85336271640032dc4b69f78 Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.816016 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.987221 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-combined-ca-bundle\") pod \"e7342471-1e91-4e79-863b-b7a8bb267083\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.987327 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-config-data\") pod \"e7342471-1e91-4e79-863b-b7a8bb267083\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.987419 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8vrz\" (UniqueName: \"kubernetes.io/projected/e7342471-1e91-4e79-863b-b7a8bb267083-kube-api-access-r8vrz\") pod 
\"e7342471-1e91-4e79-863b-b7a8bb267083\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.987542 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-nova-metadata-tls-certs\") pod \"e7342471-1e91-4e79-863b-b7a8bb267083\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.987586 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7342471-1e91-4e79-863b-b7a8bb267083-logs\") pod \"e7342471-1e91-4e79-863b-b7a8bb267083\" (UID: \"e7342471-1e91-4e79-863b-b7a8bb267083\") " Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.988006 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7342471-1e91-4e79-863b-b7a8bb267083-logs" (OuterVolumeSpecName: "logs") pod "e7342471-1e91-4e79-863b-b7a8bb267083" (UID: "e7342471-1e91-4e79-863b-b7a8bb267083"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.988593 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7342471-1e91-4e79-863b-b7a8bb267083-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:54 crc kubenswrapper[4834]: I0130 21:39:54.992376 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7342471-1e91-4e79-863b-b7a8bb267083-kube-api-access-r8vrz" (OuterVolumeSpecName: "kube-api-access-r8vrz") pod "e7342471-1e91-4e79-863b-b7a8bb267083" (UID: "e7342471-1e91-4e79-863b-b7a8bb267083"). InnerVolumeSpecName "kube-api-access-r8vrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.014363 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-config-data" (OuterVolumeSpecName: "config-data") pod "e7342471-1e91-4e79-863b-b7a8bb267083" (UID: "e7342471-1e91-4e79-863b-b7a8bb267083"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.020259 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7342471-1e91-4e79-863b-b7a8bb267083" (UID: "e7342471-1e91-4e79-863b-b7a8bb267083"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.063590 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e7342471-1e91-4e79-863b-b7a8bb267083" (UID: "e7342471-1e91-4e79-863b-b7a8bb267083"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.090700 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.090754 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8vrz\" (UniqueName: \"kubernetes.io/projected/e7342471-1e91-4e79-863b-b7a8bb267083-kube-api-access-r8vrz\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.090769 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.090781 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7342471-1e91-4e79-863b-b7a8bb267083-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.427743 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a0df633-b4a0-4147-b2ef-46982c4f59c7","Type":"ContainerStarted","Data":"a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746"} Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.427786 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a0df633-b4a0-4147-b2ef-46982c4f59c7","Type":"ContainerStarted","Data":"5111edbe5a6fb4d612b02074787eb94c3c8be5aad85336271640032dc4b69f78"} Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.429196 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e7342471-1e91-4e79-863b-b7a8bb267083","Type":"ContainerDied","Data":"72b35d4bea6f9a77d88b0c002823f14ae297a55b89cdedd3c66c80916f3101c5"} Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.429225 4834 scope.go:117] "RemoveContainer" containerID="385e15e77d0e31c14436baf593385a763b31b2e8e1be01c8af16cf387a2b6ddc" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.429329 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.435747 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79","Type":"ContainerStarted","Data":"11c86a1a9b21d79c27f67a07979fefd7141f01841f1eb423d4ca68a4e4af6252"} Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.435792 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79","Type":"ContainerStarted","Data":"0ed12a1f5bfe81c6b31caaa69ec05400b0508b480023ef69ccf551348457fffa"} Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.467264 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.467236953 podStartE2EDuration="2.467236953s" podCreationTimestamp="2026-01-30 21:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:55.459715361 +0000 UTC m=+1446.612861499" watchObservedRunningTime="2026-01-30 21:39:55.467236953 +0000 UTC m=+1446.620383121" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.536703 4834 scope.go:117] "RemoveContainer" containerID="f7f732fa0573234450d9eea22282c6582f3e2be7f2958288accf79bf9645b4d6" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.555121 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] 
Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.568630 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.575936 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:55 crc kubenswrapper[4834]: E0130 21:39:55.576704 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" containerName="nova-metadata-metadata" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.576722 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" containerName="nova-metadata-metadata" Jan 30 21:39:55 crc kubenswrapper[4834]: E0130 21:39:55.576749 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" containerName="nova-metadata-log" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.576757 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" containerName="nova-metadata-log" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.576938 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" containerName="nova-metadata-log" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.576957 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" containerName="nova-metadata-metadata" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.578061 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.581892 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.582456 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.589323 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.703834 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85572\" (UniqueName: \"kubernetes.io/projected/e3dec121-36ef-4d13-bd27-7a691e2a89c2-kube-api-access-85572\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.704125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.704285 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.705322 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dec121-36ef-4d13-bd27-7a691e2a89c2-logs\") pod 
\"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.705527 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-config-data\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.809151 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85572\" (UniqueName: \"kubernetes.io/projected/e3dec121-36ef-4d13-bd27-7a691e2a89c2-kube-api-access-85572\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.809310 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.809363 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.809929 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dec121-36ef-4d13-bd27-7a691e2a89c2-logs\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 
21:39:55.810129 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-config-data\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.810639 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dec121-36ef-4d13-bd27-7a691e2a89c2-logs\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.818584 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-config-data\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.830859 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.833081 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.844711 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85572\" (UniqueName: \"kubernetes.io/projected/e3dec121-36ef-4d13-bd27-7a691e2a89c2-kube-api-access-85572\") pod 
\"nova-metadata-0\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " pod="openstack/nova-metadata-0" Jan 30 21:39:55 crc kubenswrapper[4834]: I0130 21:39:55.904432 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:39:56 crc kubenswrapper[4834]: I0130 21:39:56.429384 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:39:56 crc kubenswrapper[4834]: I0130 21:39:56.445202 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3dec121-36ef-4d13-bd27-7a691e2a89c2","Type":"ContainerStarted","Data":"7027bf338076c899a9424e5e5699d91b3be19155ff998046ba5d88fc9f2c6a35"} Jan 30 21:39:56 crc kubenswrapper[4834]: I0130 21:39:56.446795 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a0df633-b4a0-4147-b2ef-46982c4f59c7","Type":"ContainerStarted","Data":"e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8"} Jan 30 21:39:56 crc kubenswrapper[4834]: I0130 21:39:56.478041 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.478015897 podStartE2EDuration="3.478015897s" podCreationTimestamp="2026-01-30 21:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:56.473908931 +0000 UTC m=+1447.627055079" watchObservedRunningTime="2026-01-30 21:39:56.478015897 +0000 UTC m=+1447.631162045" Jan 30 21:39:57 crc kubenswrapper[4834]: I0130 21:39:57.087343 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4fc677f-7p2sv" podUID="f94b59b7-cd95-45e4-ac51-213483a8cd62" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Jan 30 21:39:57 crc kubenswrapper[4834]: I0130 21:39:57.461995 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-metadata-0" event={"ID":"e3dec121-36ef-4d13-bd27-7a691e2a89c2","Type":"ContainerStarted","Data":"a818a9d1e9689de9092288846ee127a0de243ae4a6f954783929b829e2320d6a"} Jan 30 21:39:57 crc kubenswrapper[4834]: I0130 21:39:57.462330 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3dec121-36ef-4d13-bd27-7a691e2a89c2","Type":"ContainerStarted","Data":"97aec49a7aa3d24dd34f1df1e46188d3ef201a34cb547fd224d20e3dc3b9c414"} Jan 30 21:39:57 crc kubenswrapper[4834]: I0130 21:39:57.545006 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7342471-1e91-4e79-863b-b7a8bb267083" path="/var/lib/kubelet/pods/e7342471-1e91-4e79-863b-b7a8bb267083/volumes" Jan 30 21:39:58 crc kubenswrapper[4834]: I0130 21:39:58.509387 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.509365719 podStartE2EDuration="3.509365719s" podCreationTimestamp="2026-01-30 21:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:39:58.502349991 +0000 UTC m=+1449.655496129" watchObservedRunningTime="2026-01-30 21:39:58.509365719 +0000 UTC m=+1449.662511867" Jan 30 21:39:58 crc kubenswrapper[4834]: I0130 21:39:58.895041 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 30 21:40:00 crc kubenswrapper[4834]: I0130 21:40:00.905094 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 21:40:00 crc kubenswrapper[4834]: I0130 21:40:00.905505 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 21:40:01 crc kubenswrapper[4834]: I0130 21:40:01.234471 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9vn2x" 
podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="registry-server" probeResult="failure" output=< Jan 30 21:40:01 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:40:01 crc kubenswrapper[4834]: > Jan 30 21:40:03 crc kubenswrapper[4834]: I0130 21:40:03.888160 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:40:03 crc kubenswrapper[4834]: I0130 21:40:03.888571 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:40:03 crc kubenswrapper[4834]: I0130 21:40:03.894616 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 30 21:40:03 crc kubenswrapper[4834]: I0130 21:40:03.943561 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 30 21:40:04 crc kubenswrapper[4834]: I0130 21:40:04.576520 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 30 21:40:04 crc kubenswrapper[4834]: I0130 21:40:04.970635 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 21:40:04 crc kubenswrapper[4834]: I0130 21:40:04.970678 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 21:40:05 crc kubenswrapper[4834]: I0130 21:40:05.905679 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 21:40:05 
crc kubenswrapper[4834]: I0130 21:40:05.905765 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 21:40:06 crc kubenswrapper[4834]: I0130 21:40:06.922513 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:40:06 crc kubenswrapper[4834]: I0130 21:40:06.922533 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:40:11 crc kubenswrapper[4834]: I0130 21:40:11.222593 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9vn2x" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="registry-server" probeResult="failure" output=< Jan 30 21:40:11 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:40:11 crc kubenswrapper[4834]: > Jan 30 21:40:13 crc kubenswrapper[4834]: I0130 21:40:13.891350 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 21:40:13 crc kubenswrapper[4834]: I0130 21:40:13.892824 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 21:40:13 crc kubenswrapper[4834]: I0130 21:40:13.892965 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 21:40:13 crc kubenswrapper[4834]: I0130 21:40:13.898540 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 21:40:14 crc 
kubenswrapper[4834]: I0130 21:40:14.666204 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 21:40:14 crc kubenswrapper[4834]: I0130 21:40:14.671731 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 21:40:14 crc kubenswrapper[4834]: I0130 21:40:14.913171 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-t2r4g"] Jan 30 21:40:14 crc kubenswrapper[4834]: I0130 21:40:14.915612 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.021324 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-t2r4g"] Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.037506 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.037772 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.037793 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") 
" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.037825 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-config\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.037879 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.037912 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msvnd\" (UniqueName: \"kubernetes.io/projected/89a4de72-8a0e-4fac-a03e-d01ed420df81-kube-api-access-msvnd\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.140835 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.140935 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " 
pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.140966 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.141002 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-config\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.141070 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.141111 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msvnd\" (UniqueName: \"kubernetes.io/projected/89a4de72-8a0e-4fac-a03e-d01ed420df81-kube-api-access-msvnd\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.141700 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc 
kubenswrapper[4834]: I0130 21:40:15.142192 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.142312 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.146317 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.146635 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-config\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.179147 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msvnd\" (UniqueName: \"kubernetes.io/projected/89a4de72-8a0e-4fac-a03e-d01ed420df81-kube-api-access-msvnd\") pod \"dnsmasq-dns-5c7b6c5df9-t2r4g\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.238514 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.743262 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-t2r4g"] Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.912659 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.931820 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 21:40:15 crc kubenswrapper[4834]: I0130 21:40:15.933536 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 21:40:16 crc kubenswrapper[4834]: I0130 21:40:16.683527 4834 generic.go:334] "Generic (PLEG): container finished" podID="89a4de72-8a0e-4fac-a03e-d01ed420df81" containerID="ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9" exitCode=0 Jan 30 21:40:16 crc kubenswrapper[4834]: I0130 21:40:16.683641 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" event={"ID":"89a4de72-8a0e-4fac-a03e-d01ed420df81","Type":"ContainerDied","Data":"ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9"} Jan 30 21:40:16 crc kubenswrapper[4834]: I0130 21:40:16.683870 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" event={"ID":"89a4de72-8a0e-4fac-a03e-d01ed420df81","Type":"ContainerStarted","Data":"9849eee6d9b839afc52f61b638779f9cf814e887daa2624043ca4e0be0c9a6cf"} Jan 30 21:40:16 crc kubenswrapper[4834]: I0130 21:40:16.686272 4834 generic.go:334] "Generic (PLEG): container finished" podID="4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" containerID="21c7fda79770720423c94ff154c9288f7e61bc015197ccf840cdf8cd0dbfad10" exitCode=137 Jan 30 21:40:16 crc kubenswrapper[4834]: I0130 21:40:16.686762 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4","Type":"ContainerDied","Data":"21c7fda79770720423c94ff154c9288f7e61bc015197ccf840cdf8cd0dbfad10"} Jan 30 21:40:16 crc kubenswrapper[4834]: I0130 21:40:16.694052 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.451929 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.452774 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="ceilometer-central-agent" containerID="cri-o://7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1" gracePeriod=30 Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.452830 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="sg-core" containerID="cri-o://8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496" gracePeriod=30 Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.452831 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="proxy-httpd" containerID="cri-o://5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f" gracePeriod=30 Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.452882 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="ceilometer-notification-agent" containerID="cri-o://6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5" gracePeriod=30 Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.467822 4834 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.612232 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-combined-ca-bundle\") pod \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.612407 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-config-data\") pod \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.612561 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52w5h\" (UniqueName: \"kubernetes.io/projected/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-kube-api-access-52w5h\") pod \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\" (UID: \"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4\") " Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.646297 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-kube-api-access-52w5h" (OuterVolumeSpecName: "kube-api-access-52w5h") pod "4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" (UID: "4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4"). InnerVolumeSpecName "kube-api-access-52w5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.718945 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52w5h\" (UniqueName: \"kubernetes.io/projected/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-kube-api-access-52w5h\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.727621 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.739454 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4","Type":"ContainerDied","Data":"9466372a2b51d644eb075d4728225949fa9a03935fa926b94816372c2b1da0d1"} Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.739517 4834 scope.go:117] "RemoveContainer" containerID="21c7fda79770720423c94ff154c9288f7e61bc015197ccf840cdf8cd0dbfad10" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.739691 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.745529 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-config-data" (OuterVolumeSpecName: "config-data") pod "4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" (UID: "4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.745894 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" (UID: "4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.763517 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-log" containerID="cri-o://a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746" gracePeriod=30 Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.763858 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-api" containerID="cri-o://e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8" gracePeriod=30 Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.764045 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" event={"ID":"89a4de72-8a0e-4fac-a03e-d01ed420df81","Type":"ContainerStarted","Data":"7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4"} Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.771555 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.808079 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" podStartSLOduration=3.8080607090000003 podStartE2EDuration="3.808060709s" podCreationTimestamp="2026-01-30 21:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:40:17.80599056 +0000 UTC m=+1468.959136708" watchObservedRunningTime="2026-01-30 21:40:17.808060709 +0000 UTC m=+1468.961206847" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.823591 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:17 crc kubenswrapper[4834]: I0130 21:40:17.823732 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.075037 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.089733 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.105074 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:40:18 crc kubenswrapper[4834]: E0130 21:40:18.105608 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.105632 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.105907 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" containerName="nova-cell1-novncproxy-novncproxy" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.107100 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.176276 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.176683 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.176982 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.180168 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvmh\" (UniqueName: \"kubernetes.io/projected/255a980b-28cb-4fe1-a9b7-3b504df162a5-kube-api-access-5xvmh\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.180360 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.180510 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.180646 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.180749 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.207267 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.282806 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.283645 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.283970 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc 
kubenswrapper[4834]: I0130 21:40:18.284021 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.284354 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvmh\" (UniqueName: \"kubernetes.io/projected/255a980b-28cb-4fe1-a9b7-3b504df162a5-kube-api-access-5xvmh\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.287834 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.288296 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.289115 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.292799 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255a980b-28cb-4fe1-a9b7-3b504df162a5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.304864 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xvmh\" (UniqueName: \"kubernetes.io/projected/255a980b-28cb-4fe1-a9b7-3b504df162a5-kube-api-access-5xvmh\") pod \"nova-cell1-novncproxy-0\" (UID: \"255a980b-28cb-4fe1-a9b7-3b504df162a5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.504445 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.793834 4834 generic.go:334] "Generic (PLEG): container finished" podID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerID="a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746" exitCode=143 Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.793936 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a0df633-b4a0-4147-b2ef-46982c4f59c7","Type":"ContainerDied","Data":"a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746"} Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.801064 4834 generic.go:334] "Generic (PLEG): container finished" podID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerID="8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496" exitCode=2 Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.801105 4834 generic.go:334] "Generic (PLEG): container finished" podID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerID="7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1" exitCode=0 Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.801902 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerDied","Data":"8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496"} Jan 30 21:40:18 crc kubenswrapper[4834]: I0130 21:40:18.801950 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerDied","Data":"7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1"} Jan 30 21:40:19 crc kubenswrapper[4834]: I0130 21:40:19.019884 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 30 21:40:19 crc kubenswrapper[4834]: W0130 21:40:19.026613 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255a980b_28cb_4fe1_a9b7_3b504df162a5.slice/crio-c619ee2827ab9186843c133b3b187dc0966ccece107e9f1f4f1e86568f28c80c WatchSource:0}: Error finding container c619ee2827ab9186843c133b3b187dc0966ccece107e9f1f4f1e86568f28c80c: Status 404 returned error can't find the container with id c619ee2827ab9186843c133b3b187dc0966ccece107e9f1f4f1e86568f28c80c Jan 30 21:40:19 crc kubenswrapper[4834]: I0130 21:40:19.545962 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4" path="/var/lib/kubelet/pods/4c7da8cd-b4e6-4aa8-9168-91637a2bfdd4/volumes" Jan 30 21:40:19 crc kubenswrapper[4834]: I0130 21:40:19.816914 4834 generic.go:334] "Generic (PLEG): container finished" podID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerID="5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f" exitCode=0 Jan 30 21:40:19 crc kubenswrapper[4834]: I0130 21:40:19.816975 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerDied","Data":"5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f"} Jan 30 21:40:19 crc 
kubenswrapper[4834]: I0130 21:40:19.818770 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"255a980b-28cb-4fe1-a9b7-3b504df162a5","Type":"ContainerStarted","Data":"5cf0fef96d074ffa4e5576e26d28ec5214a42ef541def940e6231ba1db000577"} Jan 30 21:40:19 crc kubenswrapper[4834]: I0130 21:40:19.818793 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"255a980b-28cb-4fe1-a9b7-3b504df162a5","Type":"ContainerStarted","Data":"c619ee2827ab9186843c133b3b187dc0966ccece107e9f1f4f1e86568f28c80c"} Jan 30 21:40:19 crc kubenswrapper[4834]: I0130 21:40:19.884890 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.8848685760000001 podStartE2EDuration="1.884868576s" podCreationTimestamp="2026-01-30 21:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:40:19.83649646 +0000 UTC m=+1470.989642588" watchObservedRunningTime="2026-01-30 21:40:19.884868576 +0000 UTC m=+1471.038014734" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.227091 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9vn2x" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="registry-server" probeResult="failure" output=< Jan 30 21:40:21 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:40:21 crc kubenswrapper[4834]: > Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.370888 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.473508 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-combined-ca-bundle\") pod \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.473825 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-config-data\") pod \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.473865 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0df633-b4a0-4147-b2ef-46982c4f59c7-logs\") pod \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.473927 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m2vh\" (UniqueName: \"kubernetes.io/projected/6a0df633-b4a0-4147-b2ef-46982c4f59c7-kube-api-access-7m2vh\") pod \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\" (UID: \"6a0df633-b4a0-4147-b2ef-46982c4f59c7\") " Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.477736 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0df633-b4a0-4147-b2ef-46982c4f59c7-logs" (OuterVolumeSpecName: "logs") pod "6a0df633-b4a0-4147-b2ef-46982c4f59c7" (UID: "6a0df633-b4a0-4147-b2ef-46982c4f59c7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.489094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0df633-b4a0-4147-b2ef-46982c4f59c7-kube-api-access-7m2vh" (OuterVolumeSpecName: "kube-api-access-7m2vh") pod "6a0df633-b4a0-4147-b2ef-46982c4f59c7" (UID: "6a0df633-b4a0-4147-b2ef-46982c4f59c7"). InnerVolumeSpecName "kube-api-access-7m2vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.509133 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a0df633-b4a0-4147-b2ef-46982c4f59c7" (UID: "6a0df633-b4a0-4147-b2ef-46982c4f59c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.539321 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-config-data" (OuterVolumeSpecName: "config-data") pod "6a0df633-b4a0-4147-b2ef-46982c4f59c7" (UID: "6a0df633-b4a0-4147-b2ef-46982c4f59c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.576956 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.577000 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0df633-b4a0-4147-b2ef-46982c4f59c7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.577012 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0df633-b4a0-4147-b2ef-46982c4f59c7-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.577024 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m2vh\" (UniqueName: \"kubernetes.io/projected/6a0df633-b4a0-4147-b2ef-46982c4f59c7-kube-api-access-7m2vh\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.837142 4834 generic.go:334] "Generic (PLEG): container finished" podID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerID="e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8" exitCode=0 Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.837287 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.837305 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a0df633-b4a0-4147-b2ef-46982c4f59c7","Type":"ContainerDied","Data":"e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8"} Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.837408 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a0df633-b4a0-4147-b2ef-46982c4f59c7","Type":"ContainerDied","Data":"5111edbe5a6fb4d612b02074787eb94c3c8be5aad85336271640032dc4b69f78"} Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.837438 4834 scope.go:117] "RemoveContainer" containerID="e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.863363 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.872071 4834 scope.go:117] "RemoveContainer" containerID="a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.876904 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.892319 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:21 crc kubenswrapper[4834]: E0130 21:40:21.893729 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-log" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.893759 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-log" Jan 30 21:40:21 crc kubenswrapper[4834]: E0130 21:40:21.893781 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" 
containerName="nova-api-api" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.893790 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-api" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.894000 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-log" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.894009 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" containerName="nova-api-api" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.895087 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.897922 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.898188 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.905010 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.911837 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.912015 4834 scope.go:117] "RemoveContainer" containerID="e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8" Jan 30 21:40:21 crc kubenswrapper[4834]: E0130 21:40:21.914900 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8\": container with ID starting with e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8 not found: ID does not exist" 
containerID="e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.914950 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8"} err="failed to get container status \"e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8\": rpc error: code = NotFound desc = could not find container \"e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8\": container with ID starting with e914833790f429ac5e1451172d982b4a954fbdde1838b9e4f1e05ef2bc62a2a8 not found: ID does not exist" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.914982 4834 scope.go:117] "RemoveContainer" containerID="a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746" Jan 30 21:40:21 crc kubenswrapper[4834]: E0130 21:40:21.916591 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746\": container with ID starting with a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746 not found: ID does not exist" containerID="a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.916643 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746"} err="failed to get container status \"a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746\": rpc error: code = NotFound desc = could not find container \"a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746\": container with ID starting with a07cc046d72cdb459153c2e5db806ae5436d68ca57b4895fdf062987f5c6f746 not found: ID does not exist" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.986224 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-config-data\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.986284 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbkpx\" (UniqueName: \"kubernetes.io/projected/40b51700-418f-4552-839c-9af2f8903c7a-kube-api-access-qbkpx\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.986531 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b51700-418f-4552-839c-9af2f8903c7a-logs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.986598 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.986752 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:21 crc kubenswrapper[4834]: I0130 21:40:21.986879 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.088339 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.088428 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.088464 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-config-data\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.088486 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbkpx\" (UniqueName: \"kubernetes.io/projected/40b51700-418f-4552-839c-9af2f8903c7a-kube-api-access-qbkpx\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.088539 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b51700-418f-4552-839c-9af2f8903c7a-logs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.088563 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.089621 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b51700-418f-4552-839c-9af2f8903c7a-logs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.092288 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.092748 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.094620 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.099812 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-config-data\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" 
Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.121637 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbkpx\" (UniqueName: \"kubernetes.io/projected/40b51700-418f-4552-839c-9af2f8903c7a-kube-api-access-qbkpx\") pod \"nova-api-0\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.216692 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.688240 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:22 crc kubenswrapper[4834]: I0130 21:40:22.856411 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b51700-418f-4552-839c-9af2f8903c7a","Type":"ContainerStarted","Data":"8ec2bbbb277f72fc98409342d7cac148b834736d9a6c308ace030e9e456aacb6"} Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.490079 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.504794 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.522453 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-scripts\") pod \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.522539 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2ztz\" (UniqueName: \"kubernetes.io/projected/9aecf756-d98a-4d9e-9e19-e1497ae773d7-kube-api-access-l2ztz\") pod \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.522595 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-config-data\") pod \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.522648 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-ceilometer-tls-certs\") pod \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.522687 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-combined-ca-bundle\") pod \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " 
Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.522720 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-log-httpd\") pod \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.522827 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-run-httpd\") pod \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.522847 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-sg-core-conf-yaml\") pod \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\" (UID: \"9aecf756-d98a-4d9e-9e19-e1497ae773d7\") " Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.524363 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9aecf756-d98a-4d9e-9e19-e1497ae773d7" (UID: "9aecf756-d98a-4d9e-9e19-e1497ae773d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.524654 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9aecf756-d98a-4d9e-9e19-e1497ae773d7" (UID: "9aecf756-d98a-4d9e-9e19-e1497ae773d7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.536459 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-scripts" (OuterVolumeSpecName: "scripts") pod "9aecf756-d98a-4d9e-9e19-e1497ae773d7" (UID: "9aecf756-d98a-4d9e-9e19-e1497ae773d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.620058 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aecf756-d98a-4d9e-9e19-e1497ae773d7-kube-api-access-l2ztz" (OuterVolumeSpecName: "kube-api-access-l2ztz") pod "9aecf756-d98a-4d9e-9e19-e1497ae773d7" (UID: "9aecf756-d98a-4d9e-9e19-e1497ae773d7"). InnerVolumeSpecName "kube-api-access-l2ztz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.700962 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.700993 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2ztz\" (UniqueName: \"kubernetes.io/projected/9aecf756-d98a-4d9e-9e19-e1497ae773d7-kube-api-access-l2ztz\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.701005 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.701019 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9aecf756-d98a-4d9e-9e19-e1497ae773d7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:23 
crc kubenswrapper[4834]: I0130 21:40:23.739484 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9aecf756-d98a-4d9e-9e19-e1497ae773d7" (UID: "9aecf756-d98a-4d9e-9e19-e1497ae773d7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.755808 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0df633-b4a0-4147-b2ef-46982c4f59c7" path="/var/lib/kubelet/pods/6a0df633-b4a0-4147-b2ef-46982c4f59c7/volumes" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.759499 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9aecf756-d98a-4d9e-9e19-e1497ae773d7" (UID: "9aecf756-d98a-4d9e-9e19-e1497ae773d7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.810211 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.810242 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.822981 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9aecf756-d98a-4d9e-9e19-e1497ae773d7" (UID: "9aecf756-d98a-4d9e-9e19-e1497ae773d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.825596 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-config-data" (OuterVolumeSpecName: "config-data") pod "9aecf756-d98a-4d9e-9e19-e1497ae773d7" (UID: "9aecf756-d98a-4d9e-9e19-e1497ae773d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.868129 4834 generic.go:334] "Generic (PLEG): container finished" podID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerID="6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5" exitCode=0 Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.868207 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.868220 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerDied","Data":"6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5"} Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.868252 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9aecf756-d98a-4d9e-9e19-e1497ae773d7","Type":"ContainerDied","Data":"21cdc1cd5334ab4385962bad71044764a8957bd1878858e8fe32a42e3925cb90"} Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.868278 4834 scope.go:117] "RemoveContainer" containerID="5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.871213 4834 generic.go:334] "Generic (PLEG): container finished" podID="8e513111-c687-4c45-8262-7ce559c7decf" containerID="e41d8ddee34c397e7d949eb9a68e780c9a835c0c6f542f30abc6e93b93f3b631" exitCode=0 Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.871276 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" event={"ID":"8e513111-c687-4c45-8262-7ce559c7decf","Type":"ContainerDied","Data":"e41d8ddee34c397e7d949eb9a68e780c9a835c0c6f542f30abc6e93b93f3b631"} Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.880670 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b51700-418f-4552-839c-9af2f8903c7a","Type":"ContainerStarted","Data":"05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da"} Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.880713 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b51700-418f-4552-839c-9af2f8903c7a","Type":"ContainerStarted","Data":"5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97"} Jan 30 
21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.906833 4834 scope.go:117] "RemoveContainer" containerID="8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.910348 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.910326038 podStartE2EDuration="2.910326038s" podCreationTimestamp="2026-01-30 21:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:40:23.909703751 +0000 UTC m=+1475.062849889" watchObservedRunningTime="2026-01-30 21:40:23.910326038 +0000 UTC m=+1475.063472176" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.921141 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.921170 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aecf756-d98a-4d9e-9e19-e1497ae773d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.954198 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.968679 4834 scope.go:117] "RemoveContainer" containerID="6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.973970 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.991876 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:40:23 crc kubenswrapper[4834]: E0130 21:40:23.992288 4834 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="proxy-httpd" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.992299 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="proxy-httpd" Jan 30 21:40:23 crc kubenswrapper[4834]: E0130 21:40:23.992314 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="sg-core" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.992319 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="sg-core" Jan 30 21:40:23 crc kubenswrapper[4834]: E0130 21:40:23.992334 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="ceilometer-central-agent" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.992340 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="ceilometer-central-agent" Jan 30 21:40:23 crc kubenswrapper[4834]: E0130 21:40:23.992359 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="ceilometer-notification-agent" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.992365 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="ceilometer-notification-agent" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.992542 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="sg-core" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.992552 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="ceilometer-notification-agent" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.992569 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="proxy-httpd" Jan 30 21:40:23 crc kubenswrapper[4834]: I0130 21:40:23.992578 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" containerName="ceilometer-central-agent" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.000769 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.005130 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.005343 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.005554 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.022408 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.023226 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-run-httpd\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.023265 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq278\" (UniqueName: \"kubernetes.io/projected/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-kube-api-access-hq278\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.023303 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-config-data\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.023367 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.023387 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-scripts\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.023553 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.023597 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-log-httpd\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.023628 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.034857 4834 scope.go:117] "RemoveContainer" containerID="7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.057238 4834 scope.go:117] "RemoveContainer" containerID="5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f" Jan 30 21:40:24 crc kubenswrapper[4834]: E0130 21:40:24.057758 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f\": container with ID starting with 5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f not found: ID does not exist" containerID="5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.057796 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f"} err="failed to get container status \"5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f\": rpc error: code = NotFound desc = could not find container \"5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f\": container with ID starting with 5738f9c3321953a4f99cc4e92c4d208193b5898dd31034820e1737f20f98389f not found: ID does not exist" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.057819 4834 scope.go:117] "RemoveContainer" containerID="8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496" Jan 30 21:40:24 crc kubenswrapper[4834]: E0130 21:40:24.058130 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496\": container with ID starting with 8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496 not found: ID does not exist" containerID="8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.058155 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496"} err="failed to get container status \"8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496\": rpc error: code = NotFound desc = could not find container \"8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496\": container with ID starting with 8f26e39893a3d542c95ec22016f32dd408d9f3b312b8ff33aaa38f686c6e7496 not found: ID does not exist" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.058167 4834 scope.go:117] "RemoveContainer" containerID="6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5" Jan 30 21:40:24 crc kubenswrapper[4834]: E0130 21:40:24.058513 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5\": container with ID starting with 6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5 not found: ID does not exist" containerID="6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.058541 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5"} err="failed to get container status \"6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5\": rpc error: code = NotFound desc = could not find container \"6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5\": container with ID 
starting with 6641c1a252f667260f96adb98ee16ae3f1b9a51f23167a8058146ed8181d67a5 not found: ID does not exist" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.058555 4834 scope.go:117] "RemoveContainer" containerID="7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1" Jan 30 21:40:24 crc kubenswrapper[4834]: E0130 21:40:24.058725 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1\": container with ID starting with 7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1 not found: ID does not exist" containerID="7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.058746 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1"} err="failed to get container status \"7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1\": rpc error: code = NotFound desc = could not find container \"7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1\": container with ID starting with 7d730122c405c18f1958a081be5089bfa3faca78c16bb71e8b1becf95be44cc1 not found: ID does not exist" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.124839 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-run-httpd\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.124901 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq278\" (UniqueName: \"kubernetes.io/projected/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-kube-api-access-hq278\") pod \"ceilometer-0\" (UID: 
\"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.124947 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-config-data\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.125027 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.125048 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-scripts\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.125066 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.125099 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-log-httpd\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.125129 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.125323 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-run-httpd\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.127484 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-log-httpd\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.129687 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.130345 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-config-data\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.130696 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 
21:40:24.131626 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-scripts\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.133280 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.145100 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq278\" (UniqueName: \"kubernetes.io/projected/cff0c742-7a39-4a5d-91b8-f4b6304b19ef-kube-api-access-hq278\") pod \"ceilometer-0\" (UID: \"cff0c742-7a39-4a5d-91b8-f4b6304b19ef\") " pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.322772 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.843266 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 30 21:40:24 crc kubenswrapper[4834]: I0130 21:40:24.891153 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cff0c742-7a39-4a5d-91b8-f4b6304b19ef","Type":"ContainerStarted","Data":"6fe67dca08af1f186e61ab296551121deb2d913692a2c07888b578bf8023291f"} Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.240568 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.302145 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.315126 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-66tdb"] Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.315566 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" podUID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" containerName="dnsmasq-dns" containerID="cri-o://dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36" gracePeriod=10 Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.351693 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-config-data\") pod \"8e513111-c687-4c45-8262-7ce559c7decf\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.351943 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-combined-ca-bundle\") pod \"8e513111-c687-4c45-8262-7ce559c7decf\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.352009 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-scripts\") pod \"8e513111-c687-4c45-8262-7ce559c7decf\" (UID: \"8e513111-c687-4c45-8262-7ce559c7decf\") " Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.352041 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4t57\" (UniqueName: \"kubernetes.io/projected/8e513111-c687-4c45-8262-7ce559c7decf-kube-api-access-j4t57\") pod \"8e513111-c687-4c45-8262-7ce559c7decf\" (UID: 
\"8e513111-c687-4c45-8262-7ce559c7decf\") " Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.358004 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-scripts" (OuterVolumeSpecName: "scripts") pod "8e513111-c687-4c45-8262-7ce559c7decf" (UID: "8e513111-c687-4c45-8262-7ce559c7decf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.371611 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e513111-c687-4c45-8262-7ce559c7decf-kube-api-access-j4t57" (OuterVolumeSpecName: "kube-api-access-j4t57") pod "8e513111-c687-4c45-8262-7ce559c7decf" (UID: "8e513111-c687-4c45-8262-7ce559c7decf"). InnerVolumeSpecName "kube-api-access-j4t57". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.407686 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e513111-c687-4c45-8262-7ce559c7decf" (UID: "8e513111-c687-4c45-8262-7ce559c7decf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.417115 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-config-data" (OuterVolumeSpecName: "config-data") pod "8e513111-c687-4c45-8262-7ce559c7decf" (UID: "8e513111-c687-4c45-8262-7ce559c7decf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.458069 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.458109 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.458135 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e513111-c687-4c45-8262-7ce559c7decf-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.458147 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4t57\" (UniqueName: \"kubernetes.io/projected/8e513111-c687-4c45-8262-7ce559c7decf-kube-api-access-j4t57\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.553646 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aecf756-d98a-4d9e-9e19-e1497ae773d7" path="/var/lib/kubelet/pods/9aecf756-d98a-4d9e-9e19-e1497ae773d7/volumes" Jan 30 21:40:25 crc kubenswrapper[4834]: I0130 21:40:25.752528 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.868064 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-sb\") pod \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.868351 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-nb\") pod \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.868378 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-swift-storage-0\") pod \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.868474 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-config\") pod \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.868543 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-svc\") pod \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.868655 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swk76\" 
(UniqueName: \"kubernetes.io/projected/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-kube-api-access-swk76\") pod \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\" (UID: \"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9\") " Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.877440 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-kube-api-access-swk76" (OuterVolumeSpecName: "kube-api-access-swk76") pod "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" (UID: "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9"). InnerVolumeSpecName "kube-api-access-swk76". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.935043 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 21:40:26 crc kubenswrapper[4834]: E0130 21:40:25.935546 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e513111-c687-4c45-8262-7ce559c7decf" containerName="nova-cell1-conductor-db-sync" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.935568 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e513111-c687-4c45-8262-7ce559c7decf" containerName="nova-cell1-conductor-db-sync" Jan 30 21:40:26 crc kubenswrapper[4834]: E0130 21:40:25.935595 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" containerName="init" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.935602 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" containerName="init" Jan 30 21:40:26 crc kubenswrapper[4834]: E0130 21:40:25.935632 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" containerName="dnsmasq-dns" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.935642 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" 
containerName="dnsmasq-dns" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.935882 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" containerName="dnsmasq-dns" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.935911 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e513111-c687-4c45-8262-7ce559c7decf" containerName="nova-cell1-conductor-db-sync" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.939689 4834 generic.go:334] "Generic (PLEG): container finished" podID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" containerID="dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36" exitCode=0 Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.939861 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.942417 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" event={"ID":"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9","Type":"ContainerDied","Data":"dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36"} Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.942468 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-66tdb" event={"ID":"5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9","Type":"ContainerDied","Data":"1a3b18332c7ef272bf8d8b9b07830e5644afcb32d1ef7d3fe5df7d0c4222cf64"} Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.942522 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.943600 4834 scope.go:117] "RemoveContainer" containerID="dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.943701 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cff0c742-7a39-4a5d-91b8-f4b6304b19ef","Type":"ContainerStarted","Data":"e5c1435fb82966c9437b2a06010166ee266854c5550528928c2bd548b73707c0"} Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.960283 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.995324 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn85g\" (UniqueName: \"kubernetes.io/projected/026bbd14-5947-40ec-9c29-1c3153d1cfc2-kube-api-access-jn85g\") pod \"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.995510 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/026bbd14-5947-40ec-9c29-1c3153d1cfc2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.995566 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026bbd14-5947-40ec-9c29-1c3153d1cfc2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.995767 4834 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-swk76\" (UniqueName: \"kubernetes.io/projected/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-kube-api-access-swk76\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.996353 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" (UID: "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.996774 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" (UID: "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:25.999375 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" (UID: "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.009306 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" event={"ID":"8e513111-c687-4c45-8262-7ce559c7decf","Type":"ContainerDied","Data":"6888c3b018e772b38a8797c15c13770c6ac2d2ce53418128dca2ac8c42590b18"} Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.009342 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6888c3b018e772b38a8797c15c13770c6ac2d2ce53418128dca2ac8c42590b18" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.009479 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mkmbc" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.057059 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-config" (OuterVolumeSpecName: "config") pod "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" (UID: "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.057930 4834 scope.go:117] "RemoveContainer" containerID="d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.076311 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" (UID: "5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.090542 4834 scope.go:117] "RemoveContainer" containerID="dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36" Jan 30 21:40:26 crc kubenswrapper[4834]: E0130 21:40:26.090852 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36\": container with ID starting with dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36 not found: ID does not exist" containerID="dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.090877 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36"} err="failed to get container status \"dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36\": rpc error: code = NotFound desc = could not find container \"dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36\": container with ID starting with dad8b10eae78dd82a0a8503d0c96bed9c5e12f1542d2b7ad6e0007c318bf6c36 not found: ID does not exist" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.090895 4834 scope.go:117] "RemoveContainer" containerID="d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8" Jan 30 21:40:26 crc kubenswrapper[4834]: E0130 21:40:26.091671 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8\": container with ID starting with d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8 not found: ID does not exist" containerID="d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.091721 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8"} err="failed to get container status \"d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8\": rpc error: code = NotFound desc = could not find container \"d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8\": container with ID starting with d0327f2aa884b4ee2dc0c3ad12d480f495222b1759eebd9cb19bf7d808438cb8 not found: ID does not exist" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.098534 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn85g\" (UniqueName: \"kubernetes.io/projected/026bbd14-5947-40ec-9c29-1c3153d1cfc2-kube-api-access-jn85g\") pod \"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.098656 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/026bbd14-5947-40ec-9c29-1c3153d1cfc2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.098701 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026bbd14-5947-40ec-9c29-1c3153d1cfc2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.098857 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.098871 4834 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.098882 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.098893 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.098904 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.103172 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026bbd14-5947-40ec-9c29-1c3153d1cfc2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.108306 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/026bbd14-5947-40ec-9c29-1c3153d1cfc2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.114440 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn85g\" (UniqueName: \"kubernetes.io/projected/026bbd14-5947-40ec-9c29-1c3153d1cfc2-kube-api-access-jn85g\") pod 
\"nova-cell1-conductor-0\" (UID: \"026bbd14-5947-40ec-9c29-1c3153d1cfc2\") " pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.333210 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.477622 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-66tdb"] Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.489614 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-66tdb"] Jan 30 21:40:26 crc kubenswrapper[4834]: I0130 21:40:26.844330 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 30 21:40:27 crc kubenswrapper[4834]: I0130 21:40:27.032380 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"026bbd14-5947-40ec-9c29-1c3153d1cfc2","Type":"ContainerStarted","Data":"4ed90e6af57cc20d76d2cd8a8e7519e09b2cea24f8b54d9e30170d6660b54e76"} Jan 30 21:40:27 crc kubenswrapper[4834]: I0130 21:40:27.035732 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cff0c742-7a39-4a5d-91b8-f4b6304b19ef","Type":"ContainerStarted","Data":"00001790f1133f9133ea4a0a9d5073215794561d68c5b921e165cab65473928f"} Jan 30 21:40:27 crc kubenswrapper[4834]: I0130 21:40:27.550936 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9" path="/var/lib/kubelet/pods/5a9cd9a1-0f8e-4626-b69b-64872b7bc8d9/volumes" Jan 30 21:40:28 crc kubenswrapper[4834]: I0130 21:40:28.048043 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"026bbd14-5947-40ec-9c29-1c3153d1cfc2","Type":"ContainerStarted","Data":"3369eab92764cdbabdf65b50576e07043dacf61e51ba15f4fdead950debef062"} Jan 30 21:40:28 crc kubenswrapper[4834]: I0130 
21:40:28.048415 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:28 crc kubenswrapper[4834]: I0130 21:40:28.051026 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cff0c742-7a39-4a5d-91b8-f4b6304b19ef","Type":"ContainerStarted","Data":"4926298a2291c32fe9626c40cdd46f65880d29a1bb04131f51c7809e96c46382"} Jan 30 21:40:28 crc kubenswrapper[4834]: I0130 21:40:28.065993 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.065974219 podStartE2EDuration="3.065974219s" podCreationTimestamp="2026-01-30 21:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:40:28.062416738 +0000 UTC m=+1479.215562876" watchObservedRunningTime="2026-01-30 21:40:28.065974219 +0000 UTC m=+1479.219120357" Jan 30 21:40:28 crc kubenswrapper[4834]: I0130 21:40:28.504836 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:28 crc kubenswrapper[4834]: I0130 21:40:28.528844 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:29 crc kubenswrapper[4834]: I0130 21:40:29.092595 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 30 21:40:30 crc kubenswrapper[4834]: I0130 21:40:30.080899 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cff0c742-7a39-4a5d-91b8-f4b6304b19ef","Type":"ContainerStarted","Data":"5eb62aee44e02e8deecc7634c7239b0110490554ba12fec8f5ea8a7c0b95a076"} Jan 30 21:40:30 crc kubenswrapper[4834]: I0130 21:40:30.081296 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 30 21:40:30 crc 
kubenswrapper[4834]: I0130 21:40:30.111268 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.739300287 podStartE2EDuration="7.111242335s" podCreationTimestamp="2026-01-30 21:40:23 +0000 UTC" firstStartedPulling="2026-01-30 21:40:24.84595417 +0000 UTC m=+1475.999100308" lastFinishedPulling="2026-01-30 21:40:29.217896218 +0000 UTC m=+1480.371042356" observedRunningTime="2026-01-30 21:40:30.10470741 +0000 UTC m=+1481.257853588" watchObservedRunningTime="2026-01-30 21:40:30.111242335 +0000 UTC m=+1481.264388483" Jan 30 21:40:30 crc kubenswrapper[4834]: I0130 21:40:30.231463 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:40:30 crc kubenswrapper[4834]: I0130 21:40:30.296251 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:40:31 crc kubenswrapper[4834]: I0130 21:40:31.081002 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vn2x"] Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.098000 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9vn2x" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="registry-server" containerID="cri-o://554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816" gracePeriod=2 Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.221506 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.222216 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.667468 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.796170 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-utilities\") pod \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.796299 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-catalog-content\") pod \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.796340 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-545jl\" (UniqueName: \"kubernetes.io/projected/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-kube-api-access-545jl\") pod \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\" (UID: \"eee11cf6-ede3-449f-8ca4-dbf77c6e6323\") " Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.797074 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-utilities" (OuterVolumeSpecName: "utilities") pod "eee11cf6-ede3-449f-8ca4-dbf77c6e6323" (UID: "eee11cf6-ede3-449f-8ca4-dbf77c6e6323"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.818314 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-kube-api-access-545jl" (OuterVolumeSpecName: "kube-api-access-545jl") pod "eee11cf6-ede3-449f-8ca4-dbf77c6e6323" (UID: "eee11cf6-ede3-449f-8ca4-dbf77c6e6323"). InnerVolumeSpecName "kube-api-access-545jl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.840818 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eee11cf6-ede3-449f-8ca4-dbf77c6e6323" (UID: "eee11cf6-ede3-449f-8ca4-dbf77c6e6323"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.898672 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.898712 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:32 crc kubenswrapper[4834]: I0130 21:40:32.898725 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-545jl\" (UniqueName: \"kubernetes.io/projected/eee11cf6-ede3-449f-8ca4-dbf77c6e6323-kube-api-access-545jl\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.134669 4834 generic.go:334] "Generic (PLEG): container finished" podID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerID="554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816" exitCode=0 Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.135902 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vn2x" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.138645 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vn2x" event={"ID":"eee11cf6-ede3-449f-8ca4-dbf77c6e6323","Type":"ContainerDied","Data":"554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816"} Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.138731 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vn2x" event={"ID":"eee11cf6-ede3-449f-8ca4-dbf77c6e6323","Type":"ContainerDied","Data":"84210eca0fd0fc14946c50a645fa9c8ce6a9822519d616dfc21c434068f58700"} Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.138762 4834 scope.go:117] "RemoveContainer" containerID="554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.181136 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vn2x"] Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.183473 4834 scope.go:117] "RemoveContainer" containerID="dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.190668 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9vn2x"] Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.218904 4834 scope.go:117] "RemoveContainer" containerID="388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.230546 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 30 21:40:33 crc 
kubenswrapper[4834]: I0130 21:40:33.230543 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.323883 4834 scope.go:117] "RemoveContainer" containerID="554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816" Jan 30 21:40:33 crc kubenswrapper[4834]: E0130 21:40:33.324289 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816\": container with ID starting with 554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816 not found: ID does not exist" containerID="554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.324320 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816"} err="failed to get container status \"554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816\": rpc error: code = NotFound desc = could not find container \"554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816\": container with ID starting with 554662359a8f13346d627c8c9edc599d3d02cae6820a2a005884331c97774816 not found: ID does not exist" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.324338 4834 scope.go:117] "RemoveContainer" containerID="dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647" Jan 30 21:40:33 crc kubenswrapper[4834]: E0130 21:40:33.324698 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647\": container with ID starting with dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647 not found: ID does not exist" containerID="dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.324749 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647"} err="failed to get container status \"dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647\": rpc error: code = NotFound desc = could not find container \"dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647\": container with ID starting with dfa53ddd38b647536ac180585dda99bf3e1e681054e5f92de12801312013e647 not found: ID does not exist" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.324783 4834 scope.go:117] "RemoveContainer" containerID="388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab" Jan 30 21:40:33 crc kubenswrapper[4834]: E0130 21:40:33.325474 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab\": container with ID starting with 388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab not found: ID does not exist" containerID="388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.325519 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab"} err="failed to get container status \"388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab\": rpc error: code = NotFound desc = could not find container \"388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab\": container with ID 
starting with 388ce90da9a58cc4b2d58a53c0fb5c6e1d29cc3a3248abc8665323c08acefdab not found: ID does not exist" Jan 30 21:40:33 crc kubenswrapper[4834]: I0130 21:40:33.546291 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" path="/var/lib/kubelet/pods/eee11cf6-ede3-449f-8ca4-dbf77c6e6323/volumes" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.382039 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.985451 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8xqlg"] Jan 30 21:40:36 crc kubenswrapper[4834]: E0130 21:40:36.986256 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="extract-utilities" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.986283 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="extract-utilities" Jan 30 21:40:36 crc kubenswrapper[4834]: E0130 21:40:36.986297 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="registry-server" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.986306 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="registry-server" Jan 30 21:40:36 crc kubenswrapper[4834]: E0130 21:40:36.986327 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="extract-content" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.986335 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="extract-content" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.986613 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="eee11cf6-ede3-449f-8ca4-dbf77c6e6323" containerName="registry-server" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.988645 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.991625 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.991783 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 30 21:40:36 crc kubenswrapper[4834]: I0130 21:40:36.996664 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8xqlg"] Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.093687 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-scripts\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.093764 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.093851 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t7vm\" (UniqueName: \"kubernetes.io/projected/0bb78972-66ad-4912-b43d-40c181fda896-kube-api-access-4t7vm\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " 
pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.093895 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-config-data\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.195316 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-scripts\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.195386 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.195494 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t7vm\" (UniqueName: \"kubernetes.io/projected/0bb78972-66ad-4912-b43d-40c181fda896-kube-api-access-4t7vm\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.195540 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-config-data\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " 
pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.201913 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.211322 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-scripts\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.211438 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-config-data\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.215921 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t7vm\" (UniqueName: \"kubernetes.io/projected/0bb78972-66ad-4912-b43d-40c181fda896-kube-api-access-4t7vm\") pod \"nova-cell1-cell-mapping-8xqlg\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.310047 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:37 crc kubenswrapper[4834]: I0130 21:40:37.768865 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8xqlg"] Jan 30 21:40:38 crc kubenswrapper[4834]: I0130 21:40:38.190349 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8xqlg" event={"ID":"0bb78972-66ad-4912-b43d-40c181fda896","Type":"ContainerStarted","Data":"71c249264e33b8778c50b76b46f62f8e5296fe02023a413d18833ac4fa4484fa"} Jan 30 21:40:38 crc kubenswrapper[4834]: I0130 21:40:38.190738 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8xqlg" event={"ID":"0bb78972-66ad-4912-b43d-40c181fda896","Type":"ContainerStarted","Data":"b64e67194281abb6b65ac26b049dd69754ebd37b0c85cc45d553df88d6bfc148"} Jan 30 21:40:38 crc kubenswrapper[4834]: I0130 21:40:38.227530 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8xqlg" podStartSLOduration=2.227508207 podStartE2EDuration="2.227508207s" podCreationTimestamp="2026-01-30 21:40:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:40:38.217544205 +0000 UTC m=+1489.370690343" watchObservedRunningTime="2026-01-30 21:40:38.227508207 +0000 UTC m=+1489.380654355" Jan 30 21:40:42 crc kubenswrapper[4834]: I0130 21:40:42.231120 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 21:40:42 crc kubenswrapper[4834]: I0130 21:40:42.232531 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 21:40:42 crc kubenswrapper[4834]: I0130 21:40:42.232630 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 21:40:42 crc kubenswrapper[4834]: I0130 21:40:42.242437 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 21:40:43 crc kubenswrapper[4834]: I0130 21:40:43.241846 4834 generic.go:334] "Generic (PLEG): container finished" podID="0bb78972-66ad-4912-b43d-40c181fda896" containerID="71c249264e33b8778c50b76b46f62f8e5296fe02023a413d18833ac4fa4484fa" exitCode=0 Jan 30 21:40:43 crc kubenswrapper[4834]: I0130 21:40:43.241963 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8xqlg" event={"ID":"0bb78972-66ad-4912-b43d-40c181fda896","Type":"ContainerDied","Data":"71c249264e33b8778c50b76b46f62f8e5296fe02023a413d18833ac4fa4484fa"} Jan 30 21:40:43 crc kubenswrapper[4834]: I0130 21:40:43.242757 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 21:40:43 crc kubenswrapper[4834]: I0130 21:40:43.252650 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.700419 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.765550 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-scripts\") pod \"0bb78972-66ad-4912-b43d-40c181fda896\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.766456 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-combined-ca-bundle\") pod \"0bb78972-66ad-4912-b43d-40c181fda896\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.766490 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t7vm\" (UniqueName: \"kubernetes.io/projected/0bb78972-66ad-4912-b43d-40c181fda896-kube-api-access-4t7vm\") pod \"0bb78972-66ad-4912-b43d-40c181fda896\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.766580 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-config-data\") pod \"0bb78972-66ad-4912-b43d-40c181fda896\" (UID: \"0bb78972-66ad-4912-b43d-40c181fda896\") " Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.773804 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-scripts" (OuterVolumeSpecName: "scripts") pod "0bb78972-66ad-4912-b43d-40c181fda896" (UID: "0bb78972-66ad-4912-b43d-40c181fda896"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.774044 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb78972-66ad-4912-b43d-40c181fda896-kube-api-access-4t7vm" (OuterVolumeSpecName: "kube-api-access-4t7vm") pod "0bb78972-66ad-4912-b43d-40c181fda896" (UID: "0bb78972-66ad-4912-b43d-40c181fda896"). InnerVolumeSpecName "kube-api-access-4t7vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.799146 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bb78972-66ad-4912-b43d-40c181fda896" (UID: "0bb78972-66ad-4912-b43d-40c181fda896"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.803567 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-config-data" (OuterVolumeSpecName: "config-data") pod "0bb78972-66ad-4912-b43d-40c181fda896" (UID: "0bb78972-66ad-4912-b43d-40c181fda896"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.868670 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.868705 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t7vm\" (UniqueName: \"kubernetes.io/projected/0bb78972-66ad-4912-b43d-40c181fda896-kube-api-access-4t7vm\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.868718 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:44 crc kubenswrapper[4834]: I0130 21:40:44.868728 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb78972-66ad-4912-b43d-40c181fda896-scripts\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.263851 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8xqlg" Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.263876 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8xqlg" event={"ID":"0bb78972-66ad-4912-b43d-40c181fda896","Type":"ContainerDied","Data":"b64e67194281abb6b65ac26b049dd69754ebd37b0c85cc45d553df88d6bfc148"} Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.263927 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b64e67194281abb6b65ac26b049dd69754ebd37b0c85cc45d553df88d6bfc148" Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.455466 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.465619 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.465868 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" containerName="nova-scheduler-scheduler" containerID="cri-o://11c86a1a9b21d79c27f67a07979fefd7141f01841f1eb423d4ca68a4e4af6252" gracePeriod=30 Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.543568 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.543815 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-log" containerID="cri-o://97aec49a7aa3d24dd34f1df1e46188d3ef201a34cb547fd224d20e3dc3b9c414" gracePeriod=30 Jan 30 21:40:45 crc kubenswrapper[4834]: I0130 21:40:45.543895 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" 
containerName="nova-metadata-metadata" containerID="cri-o://a818a9d1e9689de9092288846ee127a0de243ae4a6f954783929b829e2320d6a" gracePeriod=30 Jan 30 21:40:46 crc kubenswrapper[4834]: I0130 21:40:46.274988 4834 generic.go:334] "Generic (PLEG): container finished" podID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerID="97aec49a7aa3d24dd34f1df1e46188d3ef201a34cb547fd224d20e3dc3b9c414" exitCode=143 Jan 30 21:40:46 crc kubenswrapper[4834]: I0130 21:40:46.275057 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3dec121-36ef-4d13-bd27-7a691e2a89c2","Type":"ContainerDied","Data":"97aec49a7aa3d24dd34f1df1e46188d3ef201a34cb547fd224d20e3dc3b9c414"} Jan 30 21:40:46 crc kubenswrapper[4834]: I0130 21:40:46.275412 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-log" containerID="cri-o://5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97" gracePeriod=30 Jan 30 21:40:46 crc kubenswrapper[4834]: I0130 21:40:46.275489 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-api" containerID="cri-o://05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da" gracePeriod=30 Jan 30 21:40:47 crc kubenswrapper[4834]: I0130 21:40:47.288619 4834 generic.go:334] "Generic (PLEG): container finished" podID="40b51700-418f-4552-839c-9af2f8903c7a" containerID="5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97" exitCode=143 Jan 30 21:40:47 crc kubenswrapper[4834]: I0130 21:40:47.288662 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b51700-418f-4552-839c-9af2f8903c7a","Type":"ContainerDied","Data":"5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97"} Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.302067 
4834 generic.go:334] "Generic (PLEG): container finished" podID="05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" containerID="11c86a1a9b21d79c27f67a07979fefd7141f01841f1eb423d4ca68a4e4af6252" exitCode=0 Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.302133 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79","Type":"ContainerDied","Data":"11c86a1a9b21d79c27f67a07979fefd7141f01841f1eb423d4ca68a4e4af6252"} Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.563209 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.648606 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxvmv\" (UniqueName: \"kubernetes.io/projected/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-kube-api-access-rxvmv\") pod \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.648747 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-combined-ca-bundle\") pod \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.648828 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-config-data\") pod \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\" (UID: \"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79\") " Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.654676 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-kube-api-access-rxvmv" (OuterVolumeSpecName: 
"kube-api-access-rxvmv") pod "05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" (UID: "05abe6f8-7cb5-4f41-aa26-cb06d63e3c79"). InnerVolumeSpecName "kube-api-access-rxvmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.675534 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" (UID: "05abe6f8-7cb5-4f41-aa26-cb06d63e3c79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.676310 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:43692->10.217.0.210:8775: read: connection reset by peer" Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.676341 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:43706->10.217.0.210:8775: read: connection reset by peer" Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.683788 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-config-data" (OuterVolumeSpecName: "config-data") pod "05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" (UID: "05abe6f8-7cb5-4f41-aa26-cb06d63e3c79"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.752020 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxvmv\" (UniqueName: \"kubernetes.io/projected/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-kube-api-access-rxvmv\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.752062 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:48 crc kubenswrapper[4834]: I0130 21:40:48.752079 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.575949 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.577259 4834 generic.go:334] "Generic (PLEG): container finished" podID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerID="a818a9d1e9689de9092288846ee127a0de243ae4a6f954783929b829e2320d6a" exitCode=0 Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.604696 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"05abe6f8-7cb5-4f41-aa26-cb06d63e3c79","Type":"ContainerDied","Data":"0ed12a1f5bfe81c6b31caaa69ec05400b0508b480023ef69ccf551348457fffa"} Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.605071 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3dec121-36ef-4d13-bd27-7a691e2a89c2","Type":"ContainerDied","Data":"a818a9d1e9689de9092288846ee127a0de243ae4a6f954783929b829e2320d6a"} Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.605162 4834 scope.go:117] "RemoveContainer" 
containerID="11c86a1a9b21d79c27f67a07979fefd7141f01841f1eb423d4ca68a4e4af6252" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.681390 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.763844 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dec121-36ef-4d13-bd27-7a691e2a89c2-logs\") pod \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.763953 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-config-data\") pod \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.764045 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-nova-metadata-tls-certs\") pod \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.764097 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-combined-ca-bundle\") pod \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\" (UID: \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.764187 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85572\" (UniqueName: \"kubernetes.io/projected/e3dec121-36ef-4d13-bd27-7a691e2a89c2-kube-api-access-85572\") pod \"e3dec121-36ef-4d13-bd27-7a691e2a89c2\" (UID: 
\"e3dec121-36ef-4d13-bd27-7a691e2a89c2\") " Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.764837 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3dec121-36ef-4d13-bd27-7a691e2a89c2-logs" (OuterVolumeSpecName: "logs") pod "e3dec121-36ef-4d13-bd27-7a691e2a89c2" (UID: "e3dec121-36ef-4d13-bd27-7a691e2a89c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.768579 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3dec121-36ef-4d13-bd27-7a691e2a89c2-kube-api-access-85572" (OuterVolumeSpecName: "kube-api-access-85572") pod "e3dec121-36ef-4d13-bd27-7a691e2a89c2" (UID: "e3dec121-36ef-4d13-bd27-7a691e2a89c2"). InnerVolumeSpecName "kube-api-access-85572". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.808674 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3dec121-36ef-4d13-bd27-7a691e2a89c2" (UID: "e3dec121-36ef-4d13-bd27-7a691e2a89c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.837555 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-config-data" (OuterVolumeSpecName: "config-data") pod "e3dec121-36ef-4d13-bd27-7a691e2a89c2" (UID: "e3dec121-36ef-4d13-bd27-7a691e2a89c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.841747 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e3dec121-36ef-4d13-bd27-7a691e2a89c2" (UID: "e3dec121-36ef-4d13-bd27-7a691e2a89c2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.866006 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85572\" (UniqueName: \"kubernetes.io/projected/e3dec121-36ef-4d13-bd27-7a691e2a89c2-kube-api-access-85572\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.866045 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3dec121-36ef-4d13-bd27-7a691e2a89c2-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.866058 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.866070 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:49 crc kubenswrapper[4834]: I0130 21:40:49.866084 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3dec121-36ef-4d13-bd27-7a691e2a89c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.295443 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.377975 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-config-data\") pod \"40b51700-418f-4552-839c-9af2f8903c7a\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.378313 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-combined-ca-bundle\") pod \"40b51700-418f-4552-839c-9af2f8903c7a\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.378450 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs\") pod \"40b51700-418f-4552-839c-9af2f8903c7a\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.378498 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbkpx\" (UniqueName: \"kubernetes.io/projected/40b51700-418f-4552-839c-9af2f8903c7a-kube-api-access-qbkpx\") pod \"40b51700-418f-4552-839c-9af2f8903c7a\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.378528 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b51700-418f-4552-839c-9af2f8903c7a-logs\") pod \"40b51700-418f-4552-839c-9af2f8903c7a\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.378578 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-internal-tls-certs\") pod \"40b51700-418f-4552-839c-9af2f8903c7a\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.381573 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b51700-418f-4552-839c-9af2f8903c7a-logs" (OuterVolumeSpecName: "logs") pod "40b51700-418f-4552-839c-9af2f8903c7a" (UID: "40b51700-418f-4552-839c-9af2f8903c7a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.389875 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b51700-418f-4552-839c-9af2f8903c7a-kube-api-access-qbkpx" (OuterVolumeSpecName: "kube-api-access-qbkpx") pod "40b51700-418f-4552-839c-9af2f8903c7a" (UID: "40b51700-418f-4552-839c-9af2f8903c7a"). InnerVolumeSpecName "kube-api-access-qbkpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.417086 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-config-data" (OuterVolumeSpecName: "config-data") pod "40b51700-418f-4552-839c-9af2f8903c7a" (UID: "40b51700-418f-4552-839c-9af2f8903c7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.436186 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40b51700-418f-4552-839c-9af2f8903c7a" (UID: "40b51700-418f-4552-839c-9af2f8903c7a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:50 crc kubenswrapper[4834]: E0130 21:40:50.451735 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs podName:40b51700-418f-4552-839c-9af2f8903c7a nodeName:}" failed. No retries permitted until 2026-01-30 21:40:50.951713555 +0000 UTC m=+1502.104859683 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs") pod "40b51700-418f-4552-839c-9af2f8903c7a" (UID: "40b51700-418f-4552-839c-9af2f8903c7a") : error deleting /var/lib/kubelet/pods/40b51700-418f-4552-839c-9af2f8903c7a/volume-subpaths: remove /var/lib/kubelet/pods/40b51700-418f-4552-839c-9af2f8903c7a/volume-subpaths: no such file or directory Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.454037 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "40b51700-418f-4552-839c-9af2f8903c7a" (UID: "40b51700-418f-4552-839c-9af2f8903c7a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.482285 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.482317 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.482327 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.482337 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbkpx\" (UniqueName: \"kubernetes.io/projected/40b51700-418f-4552-839c-9af2f8903c7a-kube-api-access-qbkpx\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.482347 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b51700-418f-4552-839c-9af2f8903c7a-logs\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.589617 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.589615 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3dec121-36ef-4d13-bd27-7a691e2a89c2","Type":"ContainerDied","Data":"7027bf338076c899a9424e5e5699d91b3be19155ff998046ba5d88fc9f2c6a35"} Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.589745 4834 scope.go:117] "RemoveContainer" containerID="a818a9d1e9689de9092288846ee127a0de243ae4a6f954783929b829e2320d6a" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.591973 4834 generic.go:334] "Generic (PLEG): container finished" podID="40b51700-418f-4552-839c-9af2f8903c7a" containerID="05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da" exitCode=0 Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.592009 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.592041 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b51700-418f-4552-839c-9af2f8903c7a","Type":"ContainerDied","Data":"05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da"} Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.592081 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"40b51700-418f-4552-839c-9af2f8903c7a","Type":"ContainerDied","Data":"8ec2bbbb277f72fc98409342d7cac148b834736d9a6c308ace030e9e456aacb6"} Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.628026 4834 scope.go:117] "RemoveContainer" containerID="97aec49a7aa3d24dd34f1df1e46188d3ef201a34cb547fd224d20e3dc3b9c414" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.633920 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.652199 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.660303 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.662714 4834 scope.go:117] "RemoveContainer" containerID="05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da" Jan 30 21:40:50 crc kubenswrapper[4834]: E0130 21:40:50.663650 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-api" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.663678 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-api" Jan 30 21:40:50 crc kubenswrapper[4834]: E0130 21:40:50.663706 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb78972-66ad-4912-b43d-40c181fda896" containerName="nova-manage" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.663714 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb78972-66ad-4912-b43d-40c181fda896" containerName="nova-manage" Jan 30 21:40:50 crc kubenswrapper[4834]: E0130 21:40:50.663747 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-log" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.663758 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-log" Jan 30 21:40:50 crc kubenswrapper[4834]: E0130 21:40:50.663832 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" containerName="nova-scheduler-scheduler" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.663841 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" containerName="nova-scheduler-scheduler" Jan 30 21:40:50 crc kubenswrapper[4834]: 
E0130 21:40:50.663858 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-log" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.663866 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-log" Jan 30 21:40:50 crc kubenswrapper[4834]: E0130 21:40:50.663885 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-metadata" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.663894 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-metadata" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.668045 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-metadata" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.668107 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" containerName="nova-scheduler-scheduler" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.668365 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bb78972-66ad-4912-b43d-40c181fda896" containerName="nova-manage" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.668420 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-api" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.668444 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b51700-418f-4552-839c-9af2f8903c7a" containerName="nova-api-log" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.668469 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" containerName="nova-metadata-log" Jan 30 
21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.673483 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.675686 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.688636 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.688745 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-config-data\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.688823 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-logs\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.688888 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w48zj\" (UniqueName: \"kubernetes.io/projected/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-kube-api-access-w48zj\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.689025 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.689454 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.701079 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.732639 4834 scope.go:117] "RemoveContainer" containerID="5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.757918 4834 scope.go:117] "RemoveContainer" containerID="05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da" Jan 30 21:40:50 crc kubenswrapper[4834]: E0130 21:40:50.758407 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da\": container with ID starting with 05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da not found: ID does not exist" containerID="05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.758451 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da"} err="failed to get container status \"05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da\": rpc error: code = NotFound desc = could not find container \"05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da\": container with ID starting with 
05f90dd55f8a9f3c979f30ad4f027c6334b873fbec8e9b2b70aef634d1fab0da not found: ID does not exist" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.758480 4834 scope.go:117] "RemoveContainer" containerID="5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97" Jan 30 21:40:50 crc kubenswrapper[4834]: E0130 21:40:50.758947 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97\": container with ID starting with 5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97 not found: ID does not exist" containerID="5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.758991 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97"} err="failed to get container status \"5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97\": rpc error: code = NotFound desc = could not find container \"5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97\": container with ID starting with 5d82975e749053012bd8ff4fd45339ec21dd82c8a71316badeac1fab89cf2e97 not found: ID does not exist" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.790330 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w48zj\" (UniqueName: \"kubernetes.io/projected/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-kube-api-access-w48zj\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.790451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.790513 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.790561 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-config-data\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.790584 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-logs\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.790961 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-logs\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.795456 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-config-data\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.797175 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.797205 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.809026 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w48zj\" (UniqueName: \"kubernetes.io/projected/cafa3f48-a62c-47fe-a1a2-d5bc73c1d944-kube-api-access-w48zj\") pod \"nova-metadata-0\" (UID: \"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944\") " pod="openstack/nova-metadata-0" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.993118 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs\") pod \"40b51700-418f-4552-839c-9af2f8903c7a\" (UID: \"40b51700-418f-4552-839c-9af2f8903c7a\") " Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.997894 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "40b51700-418f-4552-839c-9af2f8903c7a" (UID: "40b51700-418f-4552-839c-9af2f8903c7a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:40:50 crc kubenswrapper[4834]: I0130 21:40:50.998526 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.070161 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kvtfv"] Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.075149 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.089887 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kvtfv"] Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.095850 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40b51700-418f-4552-839c-9af2f8903c7a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.197217 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-utilities\") pod \"redhat-operators-kvtfv\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.197537 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-catalog-content\") pod \"redhat-operators-kvtfv\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.197710 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n4nk\" (UniqueName: \"kubernetes.io/projected/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-kube-api-access-7n4nk\") pod \"redhat-operators-kvtfv\" (UID: 
\"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.245254 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.269557 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.278604 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.280066 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.283306 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.283553 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.288158 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.288832 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.298850 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n4nk\" (UniqueName: \"kubernetes.io/projected/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-kube-api-access-7n4nk\") pod \"redhat-operators-kvtfv\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.298911 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-utilities\") pod 
\"redhat-operators-kvtfv\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.298931 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-catalog-content\") pod \"redhat-operators-kvtfv\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.299541 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-utilities\") pod \"redhat-operators-kvtfv\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.299711 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-catalog-content\") pod \"redhat-operators-kvtfv\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.317640 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n4nk\" (UniqueName: \"kubernetes.io/projected/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-kube-api-access-7n4nk\") pod \"redhat-operators-kvtfv\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.400523 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-config-data\") pod \"nova-api-0\" (UID: 
\"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.400587 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.400742 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-logs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.400786 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96x5t\" (UniqueName: \"kubernetes.io/projected/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-kube-api-access-96x5t\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.400848 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.400919 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.473989 
4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.502372 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.502486 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.502519 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-config-data\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.502572 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.502676 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-logs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.502706 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-96x5t\" (UniqueName: \"kubernetes.io/projected/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-kube-api-access-96x5t\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.505502 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-logs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.507618 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-config-data\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.507617 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-public-tls-certs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.511116 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.511356 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 
21:40:51.521238 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96x5t\" (UniqueName: \"kubernetes.io/projected/f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0-kube-api-access-96x5t\") pod \"nova-api-0\" (UID: \"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0\") " pod="openstack/nova-api-0" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.552833 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b51700-418f-4552-839c-9af2f8903c7a" path="/var/lib/kubelet/pods/40b51700-418f-4552-839c-9af2f8903c7a/volumes" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.553823 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3dec121-36ef-4d13-bd27-7a691e2a89c2" path="/var/lib/kubelet/pods/e3dec121-36ef-4d13-bd27-7a691e2a89c2/volumes" Jan 30 21:40:51 crc kubenswrapper[4834]: I0130 21:40:51.603580 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.168815 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.299732 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.310225 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kvtfv"] Jan 30 21:40:52 crc kubenswrapper[4834]: W0130 21:40:52.318058 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf92f75b4_8d07_4f85_9d0d_2c3d5e4497b0.slice/crio-d37c794393b1533b5cfc83fa26ed56342d67b818adc58f797a50cf9bdfc08da5 WatchSource:0}: Error finding container d37c794393b1533b5cfc83fa26ed56342d67b818adc58f797a50cf9bdfc08da5: Status 404 returned error can't find the container with id d37c794393b1533b5cfc83fa26ed56342d67b818adc58f797a50cf9bdfc08da5 Jan 30 
21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.634031 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944","Type":"ContainerStarted","Data":"67c5273f2359be4e44368d519dd90e3e6028a1999ff18172db3bfdfcf2cc2aec"} Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.634077 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944","Type":"ContainerStarted","Data":"41f8694d294208ae6e4d05f71015be405e81976955ce6ab77d83ca78d9431e96"} Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.642548 4834 generic.go:334] "Generic (PLEG): container finished" podID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerID="ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd" exitCode=0 Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.642627 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kvtfv" event={"ID":"1ed4a3f8-3fd7-4a0b-af10-e646105494a2","Type":"ContainerDied","Data":"ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd"} Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.642652 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kvtfv" event={"ID":"1ed4a3f8-3fd7-4a0b-af10-e646105494a2","Type":"ContainerStarted","Data":"ba27fd0f9ad3f570701b86deb3159ecca09beaa024287daf4dae3aed7e514dae"} Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.651748 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0","Type":"ContainerStarted","Data":"62cc84cb8933e2686b75d0a655fdb87980e837cba7fd9583d21908c7f7eb0972"} Jan 30 21:40:52 crc kubenswrapper[4834]: I0130 21:40:52.651796 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0","Type":"ContainerStarted","Data":"d37c794393b1533b5cfc83fa26ed56342d67b818adc58f797a50cf9bdfc08da5"} Jan 30 21:40:53 crc kubenswrapper[4834]: I0130 21:40:53.663415 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kvtfv" event={"ID":"1ed4a3f8-3fd7-4a0b-af10-e646105494a2","Type":"ContainerStarted","Data":"8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c"} Jan 30 21:40:53 crc kubenswrapper[4834]: I0130 21:40:53.665073 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0","Type":"ContainerStarted","Data":"f48385ccbf341b0ccd6b7df3bca4776373ff4313a91b1be97c0ad0eb1fc54fa7"} Jan 30 21:40:53 crc kubenswrapper[4834]: I0130 21:40:53.669725 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cafa3f48-a62c-47fe-a1a2-d5bc73c1d944","Type":"ContainerStarted","Data":"6d02cc5f0473301228e8078b285f7aed2465c609bb105252b50a79d4b481b9fb"} Jan 30 21:40:53 crc kubenswrapper[4834]: I0130 21:40:53.710777 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.710753992 podStartE2EDuration="2.710753992s" podCreationTimestamp="2026-01-30 21:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:40:53.709185218 +0000 UTC m=+1504.862331356" watchObservedRunningTime="2026-01-30 21:40:53.710753992 +0000 UTC m=+1504.863900160" Jan 30 21:40:53 crc kubenswrapper[4834]: I0130 21:40:53.731147 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.730927961 podStartE2EDuration="3.730927961s" podCreationTimestamp="2026-01-30 21:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:40:53.727531346 +0000 UTC m=+1504.880677474" watchObservedRunningTime="2026-01-30 21:40:53.730927961 +0000 UTC m=+1504.884074099" Jan 30 21:40:54 crc kubenswrapper[4834]: I0130 21:40:54.345781 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 30 21:40:54 crc kubenswrapper[4834]: I0130 21:40:54.680746 4834 generic.go:334] "Generic (PLEG): container finished" podID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerID="8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c" exitCode=0 Jan 30 21:40:54 crc kubenswrapper[4834]: I0130 21:40:54.680902 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kvtfv" event={"ID":"1ed4a3f8-3fd7-4a0b-af10-e646105494a2","Type":"ContainerDied","Data":"8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c"} Jan 30 21:40:55 crc kubenswrapper[4834]: I0130 21:40:55.696583 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kvtfv" event={"ID":"1ed4a3f8-3fd7-4a0b-af10-e646105494a2","Type":"ContainerStarted","Data":"0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71"} Jan 30 21:40:55 crc kubenswrapper[4834]: I0130 21:40:55.724265 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kvtfv" podStartSLOduration=2.254518728 podStartE2EDuration="4.724244275s" podCreationTimestamp="2026-01-30 21:40:51 +0000 UTC" firstStartedPulling="2026-01-30 21:40:52.645029451 +0000 UTC m=+1503.798175589" lastFinishedPulling="2026-01-30 21:40:55.114754998 +0000 UTC m=+1506.267901136" observedRunningTime="2026-01-30 21:40:55.722831095 +0000 UTC m=+1506.875977243" watchObservedRunningTime="2026-01-30 21:40:55.724244275 +0000 UTC m=+1506.877390413" Jan 30 21:40:55 crc kubenswrapper[4834]: I0130 21:40:55.999659 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-metadata-0" Jan 30 21:40:55 crc kubenswrapper[4834]: I0130 21:40:55.999710 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 30 21:41:00 crc kubenswrapper[4834]: I0130 21:41:00.998891 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 21:41:01 crc kubenswrapper[4834]: I0130 21:41:01.000265 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 30 21:41:01 crc kubenswrapper[4834]: I0130 21:41:01.474499 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:41:01 crc kubenswrapper[4834]: I0130 21:41:01.474838 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:41:01 crc kubenswrapper[4834]: I0130 21:41:01.603922 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:41:01 crc kubenswrapper[4834]: I0130 21:41:01.603973 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 30 21:41:02 crc kubenswrapper[4834]: I0130 21:41:02.011520 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cafa3f48-a62c-47fe-a1a2-d5bc73c1d944" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:41:02 crc kubenswrapper[4834]: I0130 21:41:02.011538 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cafa3f48-a62c-47fe-a1a2-d5bc73c1d944" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting 
headers)" Jan 30 21:41:02 crc kubenswrapper[4834]: I0130 21:41:02.525005 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kvtfv" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="registry-server" probeResult="failure" output=< Jan 30 21:41:02 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:41:02 crc kubenswrapper[4834]: > Jan 30 21:41:02 crc kubenswrapper[4834]: I0130 21:41:02.615696 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:41:02 crc kubenswrapper[4834]: I0130 21:41:02.615765 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 30 21:41:04 crc kubenswrapper[4834]: I0130 21:41:04.161431 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:41:04 crc kubenswrapper[4834]: I0130 21:41:04.161490 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.009320 4834 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.009977 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.039953 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.040760 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.522928 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.575155 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.611616 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.612197 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.613649 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.617887 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.758984 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kvtfv"] Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.877719 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 30 21:41:11 crc kubenswrapper[4834]: I0130 21:41:11.884156 
4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 30 21:41:12 crc kubenswrapper[4834]: I0130 21:41:12.885975 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kvtfv" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="registry-server" containerID="cri-o://0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71" gracePeriod=2 Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.457321 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.589147 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-catalog-content\") pod \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.589196 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n4nk\" (UniqueName: \"kubernetes.io/projected/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-kube-api-access-7n4nk\") pod \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.589395 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-utilities\") pod \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\" (UID: \"1ed4a3f8-3fd7-4a0b-af10-e646105494a2\") " Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.590210 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-utilities" (OuterVolumeSpecName: "utilities") pod 
"1ed4a3f8-3fd7-4a0b-af10-e646105494a2" (UID: "1ed4a3f8-3fd7-4a0b-af10-e646105494a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.592835 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.595734 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-kube-api-access-7n4nk" (OuterVolumeSpecName: "kube-api-access-7n4nk") pod "1ed4a3f8-3fd7-4a0b-af10-e646105494a2" (UID: "1ed4a3f8-3fd7-4a0b-af10-e646105494a2"). InnerVolumeSpecName "kube-api-access-7n4nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.695898 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ed4a3f8-3fd7-4a0b-af10-e646105494a2" (UID: "1ed4a3f8-3fd7-4a0b-af10-e646105494a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.699918 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.699951 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n4nk\" (UniqueName: \"kubernetes.io/projected/1ed4a3f8-3fd7-4a0b-af10-e646105494a2-kube-api-access-7n4nk\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.897154 4834 generic.go:334] "Generic (PLEG): container finished" podID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerID="0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71" exitCode=0 Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.897214 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kvtfv" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.897250 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kvtfv" event={"ID":"1ed4a3f8-3fd7-4a0b-af10-e646105494a2","Type":"ContainerDied","Data":"0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71"} Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.897301 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kvtfv" event={"ID":"1ed4a3f8-3fd7-4a0b-af10-e646105494a2","Type":"ContainerDied","Data":"ba27fd0f9ad3f570701b86deb3159ecca09beaa024287daf4dae3aed7e514dae"} Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.897326 4834 scope.go:117] "RemoveContainer" containerID="0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.925503 4834 scope.go:117] "RemoveContainer" 
containerID="8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.944551 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kvtfv"] Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.962581 4834 scope.go:117] "RemoveContainer" containerID="ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd" Jan 30 21:41:13 crc kubenswrapper[4834]: I0130 21:41:13.963343 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kvtfv"] Jan 30 21:41:14 crc kubenswrapper[4834]: I0130 21:41:14.011193 4834 scope.go:117] "RemoveContainer" containerID="0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71" Jan 30 21:41:14 crc kubenswrapper[4834]: E0130 21:41:14.012462 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71\": container with ID starting with 0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71 not found: ID does not exist" containerID="0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71" Jan 30 21:41:14 crc kubenswrapper[4834]: I0130 21:41:14.012508 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71"} err="failed to get container status \"0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71\": rpc error: code = NotFound desc = could not find container \"0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71\": container with ID starting with 0129d682e26d68b0a643fd4e7e14f4d406a8654d9dc3b0ef2a78ccf82cecbd71 not found: ID does not exist" Jan 30 21:41:14 crc kubenswrapper[4834]: I0130 21:41:14.012538 4834 scope.go:117] "RemoveContainer" 
containerID="8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c"
Jan 30 21:41:14 crc kubenswrapper[4834]: E0130 21:41:14.012867 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c\": container with ID starting with 8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c not found: ID does not exist" containerID="8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c"
Jan 30 21:41:14 crc kubenswrapper[4834]: I0130 21:41:14.012907 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c"} err="failed to get container status \"8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c\": rpc error: code = NotFound desc = could not find container \"8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c\": container with ID starting with 8264f1b52f0a283d5297481be072abc7efcde42c4cb5ca107365979bebef966c not found: ID does not exist"
Jan 30 21:41:14 crc kubenswrapper[4834]: I0130 21:41:14.012932 4834 scope.go:117] "RemoveContainer" containerID="ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd"
Jan 30 21:41:14 crc kubenswrapper[4834]: E0130 21:41:14.013255 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd\": container with ID starting with ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd not found: ID does not exist" containerID="ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd"
Jan 30 21:41:14 crc kubenswrapper[4834]: I0130 21:41:14.013279 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd"} err="failed to get container status \"ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd\": rpc error: code = NotFound desc = could not find container \"ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd\": container with ID starting with ff79d5b75a9a518f40077210f07f9f1f25471dccc1833efb68ba85b4c10c02fd not found: ID does not exist"
Jan 30 21:41:15 crc kubenswrapper[4834]: I0130 21:41:15.549067 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" path="/var/lib/kubelet/pods/1ed4a3f8-3fd7-4a0b-af10-e646105494a2/volumes"
Jan 30 21:41:19 crc kubenswrapper[4834]: I0130 21:41:19.672726 4834 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod05abe6f8-7cb5-4f41-aa26-cb06d63e3c79"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod05abe6f8-7cb5-4f41-aa26-cb06d63e3c79] : Timed out while waiting for systemd to remove kubepods-besteffort-pod05abe6f8_7cb5_4f41_aa26_cb06d63e3c79.slice"
Jan 30 21:41:19 crc kubenswrapper[4834]: E0130 21:41:19.673460 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod05abe6f8-7cb5-4f41-aa26-cb06d63e3c79] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod05abe6f8-7cb5-4f41-aa26-cb06d63e3c79] : Timed out while waiting for systemd to remove kubepods-besteffort-pod05abe6f8_7cb5_4f41_aa26_cb06d63e3c79.slice" pod="openstack/nova-scheduler-0" podUID="05abe6f8-7cb5-4f41-aa26-cb06d63e3c79"
Jan 30 21:41:19 crc kubenswrapper[4834]: I0130 21:41:19.990759 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.063312 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.075534 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.088805 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:41:20 crc kubenswrapper[4834]: E0130 21:41:20.089686 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="registry-server"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.089721 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="registry-server"
Jan 30 21:41:20 crc kubenswrapper[4834]: E0130 21:41:20.089758 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="extract-utilities"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.089776 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="extract-utilities"
Jan 30 21:41:20 crc kubenswrapper[4834]: E0130 21:41:20.089816 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="extract-content"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.089833 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="extract-content"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.090281 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed4a3f8-3fd7-4a0b-af10-e646105494a2" containerName="registry-server"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.093954 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.100095 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.100735 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.245575 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cmhj\" (UniqueName: \"kubernetes.io/projected/b8c57ee1-252e-4288-9af3-18e8bc12d143-kube-api-access-5cmhj\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.245661 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c57ee1-252e-4288-9af3-18e8bc12d143-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.245889 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c57ee1-252e-4288-9af3-18e8bc12d143-config-data\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.348137 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c57ee1-252e-4288-9af3-18e8bc12d143-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.348210 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c57ee1-252e-4288-9af3-18e8bc12d143-config-data\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.348350 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cmhj\" (UniqueName: \"kubernetes.io/projected/b8c57ee1-252e-4288-9af3-18e8bc12d143-kube-api-access-5cmhj\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.355164 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8c57ee1-252e-4288-9af3-18e8bc12d143-config-data\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.358550 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c57ee1-252e-4288-9af3-18e8bc12d143-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.368079 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cmhj\" (UniqueName: \"kubernetes.io/projected/b8c57ee1-252e-4288-9af3-18e8bc12d143-kube-api-access-5cmhj\") pod \"nova-scheduler-0\" (UID: \"b8c57ee1-252e-4288-9af3-18e8bc12d143\") " pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.424415 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 30 21:41:20 crc kubenswrapper[4834]: I0130 21:41:20.960016 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 30 21:41:21 crc kubenswrapper[4834]: I0130 21:41:21.017058 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8c57ee1-252e-4288-9af3-18e8bc12d143","Type":"ContainerStarted","Data":"f7d57b5111aa5390977df6c88f9a8983a3f12f649619af057827bd66628b2474"}
Jan 30 21:41:21 crc kubenswrapper[4834]: I0130 21:41:21.540574 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05abe6f8-7cb5-4f41-aa26-cb06d63e3c79" path="/var/lib/kubelet/pods/05abe6f8-7cb5-4f41-aa26-cb06d63e3c79/volumes"
Jan 30 21:41:22 crc kubenswrapper[4834]: I0130 21:41:22.027632 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b8c57ee1-252e-4288-9af3-18e8bc12d143","Type":"ContainerStarted","Data":"337ba3aa5f0b16bd94153e7089c125c914a0c7a7d04b9ad704e17df279a3940f"}
Jan 30 21:41:22 crc kubenswrapper[4834]: I0130 21:41:22.059982 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.0599648249999998 podStartE2EDuration="2.059964825s" podCreationTimestamp="2026-01-30 21:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:41:22.047850283 +0000 UTC m=+1533.200996431" watchObservedRunningTime="2026-01-30 21:41:22.059964825 +0000 UTC m=+1533.213110963"
Jan 30 21:41:25 crc kubenswrapper[4834]: I0130 21:41:25.424597 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 30 21:41:30 crc kubenswrapper[4834]: I0130 21:41:30.424843 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 30 21:41:30 crc kubenswrapper[4834]: I0130 21:41:30.454135 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 30 21:41:31 crc kubenswrapper[4834]: I0130 21:41:31.174580 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 30 21:41:34 crc kubenswrapper[4834]: I0130 21:41:34.161042 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 30 21:41:34 crc kubenswrapper[4834]: I0130 21:41:34.161311 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 30 21:41:39 crc kubenswrapper[4834]: I0130 21:41:39.590095 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 30 21:41:40 crc kubenswrapper[4834]: I0130 21:41:40.254279 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 30 21:41:44 crc kubenswrapper[4834]: I0130 21:41:44.576901 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerName="rabbitmq" containerID="cri-o://a2f57942c10e77446bfc01dc91bd3dc17789a4321895b0d223014e4622a2f767" gracePeriod=604796
Jan 30 21:41:44 crc kubenswrapper[4834]: I0130 21:41:44.698863 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="bd8c97eb-154c-451c-88ec-025f6148936c" containerName="rabbitmq" containerID="cri-o://1dfc18f738a42fea16bef16f66c063db2e73e7bc2d4fcf8e74f75bdef09ebcc8" gracePeriod=604795
Jan 30 21:41:49 crc kubenswrapper[4834]: I0130 21:41:49.458457 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bd8c97eb-154c-451c-88ec-025f6148936c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.112:5671: connect: connection refused"
Jan 30 21:41:49 crc kubenswrapper[4834]: I0130 21:41:49.783105 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.113:5671: connect: connection refused"
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.339647 4834 generic.go:334] "Generic (PLEG): container finished" podID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerID="a2f57942c10e77446bfc01dc91bd3dc17789a4321895b0d223014e4622a2f767" exitCode=0
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.339781 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"500f2414-6837-49ac-b834-06b5fd86d2b8","Type":"ContainerDied","Data":"a2f57942c10e77446bfc01dc91bd3dc17789a4321895b0d223014e4622a2f767"}
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.340009 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"500f2414-6837-49ac-b834-06b5fd86d2b8","Type":"ContainerDied","Data":"a5cfeed94bca20704ba339b4eb2a60ca5740a01ca6c4d7c5e22223c10f191049"}
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.340037 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5cfeed94bca20704ba339b4eb2a60ca5740a01ca6c4d7c5e22223c10f191049"
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.342168 4834 generic.go:334] "Generic (PLEG): container finished" podID="bd8c97eb-154c-451c-88ec-025f6148936c" containerID="1dfc18f738a42fea16bef16f66c063db2e73e7bc2d4fcf8e74f75bdef09ebcc8" exitCode=0
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.342254 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bd8c97eb-154c-451c-88ec-025f6148936c","Type":"ContainerDied","Data":"1dfc18f738a42fea16bef16f66c063db2e73e7bc2d4fcf8e74f75bdef09ebcc8"}
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.342291 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bd8c97eb-154c-451c-88ec-025f6148936c","Type":"ContainerDied","Data":"489f056fc942ecca987917eb48d929baafa4c81ae36d64b4c3837ca2c2417de1"}
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.342307 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="489f056fc942ecca987917eb48d929baafa4c81ae36d64b4c3837ca2c2417de1"
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.368419 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.375600 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465102 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-erlang-cookie\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465157 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-tls\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465191 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-confd\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465212 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-plugins\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465252 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmdxk\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-kube-api-access-gmdxk\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465276 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-tls\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465332 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-config-data\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465360 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-plugins-conf\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465433 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465493 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd8c97eb-154c-451c-88ec-025f6148936c-erlang-cookie-secret\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465522 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500f2414-6837-49ac-b834-06b5fd86d2b8-erlang-cookie-secret\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465558 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-confd\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465604 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-erlang-cookie\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465642 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-server-conf\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465678 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-config-data\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465713 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465765 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-server-conf\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465827 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-plugins\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465855 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-plugins-conf\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465884 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd8c97eb-154c-451c-88ec-025f6148936c-pod-info\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465935 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sb9g\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-kube-api-access-8sb9g\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.465978 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500f2414-6837-49ac-b834-06b5fd86d2b8-pod-info\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.466505 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.467084 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.471110 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.473045 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.475634 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.476628 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.476715 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-kube-api-access-gmdxk" (OuterVolumeSpecName: "kube-api-access-gmdxk") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "kube-api-access-gmdxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.476713 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bd8c97eb-154c-451c-88ec-025f6148936c-pod-info" (OuterVolumeSpecName: "pod-info") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.476762 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.476802 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8c97eb-154c-451c-88ec-025f6148936c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.477098 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.477633 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.479906 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.481641 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-kube-api-access-8sb9g" (OuterVolumeSpecName: "kube-api-access-8sb9g") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "kube-api-access-8sb9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.492300 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500f2414-6837-49ac-b834-06b5fd86d2b8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.492531 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/500f2414-6837-49ac-b834-06b5fd86d2b8-pod-info" (OuterVolumeSpecName: "pod-info") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.505632 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-config-data" (OuterVolumeSpecName: "config-data") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.518523 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-config-data" (OuterVolumeSpecName: "config-data") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.567441 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-server-conf" (OuterVolumeSpecName: "server-conf") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.568584 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-server-conf" (OuterVolumeSpecName: "server-conf") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.571540 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-server-conf\") pod \"500f2414-6837-49ac-b834-06b5fd86d2b8\" (UID: \"500f2414-6837-49ac-b834-06b5fd86d2b8\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.571647 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-server-conf\") pod \"bd8c97eb-154c-451c-88ec-025f6148936c\" (UID: \"bd8c97eb-154c-451c-88ec-025f6148936c\") "
Jan 30 21:41:51 crc kubenswrapper[4834]: W0130 21:41:51.571900 4834 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/500f2414-6837-49ac-b834-06b5fd86d2b8/volumes/kubernetes.io~configmap/server-conf
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.571924 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-server-conf" (OuterVolumeSpecName: "server-conf") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: W0130 21:41:51.572052 4834 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bd8c97eb-154c-451c-88ec-025f6148936c/volumes/kubernetes.io~configmap/server-conf
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572063 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-server-conf" (OuterVolumeSpecName: "server-conf") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572419 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572445 4834 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd8c97eb-154c-451c-88ec-025f6148936c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572462 4834 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/500f2414-6837-49ac-b834-06b5fd86d2b8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572474 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572485 4834 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-server-conf\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572495 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572512 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572521 4834 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-server-conf\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.572531 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.573231 4834 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd8c97eb-154c-451c-88ec-025f6148936c-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578504 4834 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd8c97eb-154c-451c-88ec-025f6148936c-pod-info\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578542 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sb9g\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-kube-api-access-8sb9g\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578558 4834 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/500f2414-6837-49ac-b834-06b5fd86d2b8-pod-info\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578572 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578583 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578594 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578606 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmdxk\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-kube-api-access-gmdxk\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578617 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578629 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-config-data\") on node \"crc\" DevicePath \"\""
Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.578640 4834 reconciler_common.go:293] "Volume detached for volume
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/500f2414-6837-49ac-b834-06b5fd86d2b8-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.619249 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.619639 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.638105 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bd8c97eb-154c-451c-88ec-025f6148936c" (UID: "bd8c97eb-154c-451c-88ec-025f6148936c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.638739 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "500f2414-6837-49ac-b834-06b5fd86d2b8" (UID: "500f2414-6837-49ac-b834-06b5fd86d2b8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.681009 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/500f2414-6837-49ac-b834-06b5fd86d2b8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.681054 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.681067 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd8c97eb-154c-451c-88ec-025f6148936c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:51 crc kubenswrapper[4834]: I0130 21:41:51.681079 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.352543 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.352593 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.396304 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.407624 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.424851 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.432788 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.443353 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:41:52 crc kubenswrapper[4834]: E0130 21:41:52.443950 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8c97eb-154c-451c-88ec-025f6148936c" containerName="rabbitmq" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.443979 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8c97eb-154c-451c-88ec-025f6148936c" containerName="rabbitmq" Jan 30 21:41:52 crc kubenswrapper[4834]: E0130 21:41:52.444002 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerName="setup-container" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.444038 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerName="setup-container" Jan 30 21:41:52 crc kubenswrapper[4834]: E0130 21:41:52.444072 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8c97eb-154c-451c-88ec-025f6148936c" containerName="setup-container" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.444084 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8c97eb-154c-451c-88ec-025f6148936c" 
containerName="setup-container" Jan 30 21:41:52 crc kubenswrapper[4834]: E0130 21:41:52.444102 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerName="rabbitmq" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.444112 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerName="rabbitmq" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.444428 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8c97eb-154c-451c-88ec-025f6148936c" containerName="rabbitmq" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.444457 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="500f2414-6837-49ac-b834-06b5fd86d2b8" containerName="rabbitmq" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.446095 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.448522 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.448530 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.449218 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.449728 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.450298 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.450895 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-cell1-server-dockercfg-58hcq" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.450900 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.458932 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.513362 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.515046 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.519901 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.520195 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-zw5br" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.520356 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.520601 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.520753 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.520878 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.521224 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.565106 4834 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619729 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619779 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619813 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619840 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/927e5578-7c09-4caf-ab81-0e8229f8aef0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619870 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 
21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619900 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619923 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619958 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.619973 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620002 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620020 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/927e5578-7c09-4caf-ab81-0e8229f8aef0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620053 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620145 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620191 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620247 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd7kn\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-kube-api-access-nd7kn\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620266 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620286 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9b522ed-a619-4a0c-99dd-9f14c679b469-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620301 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v24c\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-kube-api-access-9v24c\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620314 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-config-data\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620346 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620362 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9b522ed-a619-4a0c-99dd-9f14c679b469-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.620377 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.721740 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.721800 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.721834 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/927e5578-7c09-4caf-ab81-0e8229f8aef0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.721874 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.721912 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.721951 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722011 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722034 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722087 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722112 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/927e5578-7c09-4caf-ab81-0e8229f8aef0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722136 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722172 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722200 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722262 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd7kn\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-kube-api-access-nd7kn\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc 
kubenswrapper[4834]: I0130 21:41:52.722281 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722305 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9b522ed-a619-4a0c-99dd-9f14c679b469-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722324 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v24c\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-kube-api-access-9v24c\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722345 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-config-data\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722388 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722430 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/e9b522ed-a619-4a0c-99dd-9f14c679b469-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722454 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722491 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.722699 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.726388 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.726844 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.727858 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.728224 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.728707 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.729562 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.729762 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-config-data\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.729913 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/927e5578-7c09-4caf-ab81-0e8229f8aef0-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.729948 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.730154 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.730546 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.730546 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/927e5578-7c09-4caf-ab81-0e8229f8aef0-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.730693 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/e9b522ed-a619-4a0c-99dd-9f14c679b469-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.731042 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9b522ed-a619-4a0c-99dd-9f14c679b469-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.731234 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/927e5578-7c09-4caf-ab81-0e8229f8aef0-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.731695 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9b522ed-a619-4a0c-99dd-9f14c679b469-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.733521 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.733726 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.738185 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9b522ed-a619-4a0c-99dd-9f14c679b469-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.748744 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v24c\" (UniqueName: \"kubernetes.io/projected/e9b522ed-a619-4a0c-99dd-9f14c679b469-kube-api-access-9v24c\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.753460 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd7kn\" (UniqueName: \"kubernetes.io/projected/927e5578-7c09-4caf-ab81-0e8229f8aef0-kube-api-access-nd7kn\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.770808 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"927e5578-7c09-4caf-ab81-0e8229f8aef0\") " pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.801002 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"e9b522ed-a619-4a0c-99dd-9f14c679b469\") " pod="openstack/rabbitmq-server-0" Jan 30 21:41:52 crc kubenswrapper[4834]: I0130 21:41:52.848820 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.065328 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.309357 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-rdk4z"] Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.311518 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.314907 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.328615 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-rdk4z"] Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.362149 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.442011 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-rdk4z"] Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.443462 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-config\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: E0130 21:41:53.443690 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-2mshw openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" 
podUID="2df8decd-481b-464e-aeb2-e92fbbe19d14" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.443754 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.443810 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.443837 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-svc\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.443861 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.443932 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" 
(UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.443959 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mshw\" (UniqueName: \"kubernetes.io/projected/2df8decd-481b-464e-aeb2-e92fbbe19d14-kube-api-access-2mshw\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.476049 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-sm2xp"] Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.478014 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.492194 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-sm2xp"] Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.541492 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500f2414-6837-49ac-b834-06b5fd86d2b8" path="/var/lib/kubelet/pods/500f2414-6837-49ac-b834-06b5fd86d2b8/volumes" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.542583 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd8c97eb-154c-451c-88ec-025f6148936c" path="/var/lib/kubelet/pods/bd8c97eb-154c-451c-88ec-025f6148936c/volumes" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.545776 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.545839 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-svc\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.545914 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.545960 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546013 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546042 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546076 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mshw\" (UniqueName: \"kubernetes.io/projected/2df8decd-481b-464e-aeb2-e92fbbe19d14-kube-api-access-2mshw\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546101 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5ts\" (UniqueName: \"kubernetes.io/projected/626fd761-10dd-4cb5-9dbb-624ea3e5c525-kube-api-access-bs5ts\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546155 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546188 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546250 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-config\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 
21:41:53.546294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-config\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546328 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546375 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546744 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546791 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-svc\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.546853 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.547122 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.547376 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-config\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.547448 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.561645 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mshw\" (UniqueName: \"kubernetes.io/projected/2df8decd-481b-464e-aeb2-e92fbbe19d14-kube-api-access-2mshw\") pod \"dnsmasq-dns-5576978c7c-rdk4z\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.592654 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 30 21:41:53 crc 
kubenswrapper[4834]: I0130 21:41:53.648378 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.648473 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.648496 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5ts\" (UniqueName: \"kubernetes.io/projected/626fd761-10dd-4cb5-9dbb-624ea3e5c525-kube-api-access-bs5ts\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.648543 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.648561 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.648612 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-config\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.648675 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.650010 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.650362 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.650603 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-config\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.651625 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.651692 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.652138 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/626fd761-10dd-4cb5-9dbb-624ea3e5c525-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.669378 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5ts\" (UniqueName: \"kubernetes.io/projected/626fd761-10dd-4cb5-9dbb-624ea3e5c525-kube-api-access-bs5ts\") pod \"dnsmasq-dns-8c6f6df99-sm2xp\" (UID: \"626fd761-10dd-4cb5-9dbb-624ea3e5c525\") " pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:53 crc kubenswrapper[4834]: I0130 21:41:53.803234 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.121288 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-sm2xp"] Jan 30 21:41:54 crc kubenswrapper[4834]: W0130 21:41:54.124250 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod626fd761_10dd_4cb5_9dbb_624ea3e5c525.slice/crio-261b3f0de0a173095b8be93aaf77afadf3297fd3c83589cdcf4269ddde63f398 WatchSource:0}: Error finding container 261b3f0de0a173095b8be93aaf77afadf3297fd3c83589cdcf4269ddde63f398: Status 404 returned error can't find the container with id 261b3f0de0a173095b8be93aaf77afadf3297fd3c83589cdcf4269ddde63f398 Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.420880 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" event={"ID":"626fd761-10dd-4cb5-9dbb-624ea3e5c525","Type":"ContainerStarted","Data":"261b3f0de0a173095b8be93aaf77afadf3297fd3c83589cdcf4269ddde63f398"} Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.428586 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9b522ed-a619-4a0c-99dd-9f14c679b469","Type":"ContainerStarted","Data":"67e77fb49ac83c727aaeffaba3dd5eca4cfe709bd01e453f555d7ec8fe625f7a"} Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.429565 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.430205 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"927e5578-7c09-4caf-ab81-0e8229f8aef0","Type":"ContainerStarted","Data":"0f797fc01e3f4d102c7bfc8dd50ae2adb1e86873c3d91071d865cde3158a8ceb"} Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.444044 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.567471 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-swift-storage-0\") pod \"2df8decd-481b-464e-aeb2-e92fbbe19d14\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.567769 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mshw\" (UniqueName: \"kubernetes.io/projected/2df8decd-481b-464e-aeb2-e92fbbe19d14-kube-api-access-2mshw\") pod \"2df8decd-481b-464e-aeb2-e92fbbe19d14\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.567885 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-svc\") pod \"2df8decd-481b-464e-aeb2-e92fbbe19d14\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.568065 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-sb\") pod \"2df8decd-481b-464e-aeb2-e92fbbe19d14\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.568550 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-config\") pod \"2df8decd-481b-464e-aeb2-e92fbbe19d14\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.568928 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-nb\") pod \"2df8decd-481b-464e-aeb2-e92fbbe19d14\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.569264 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-openstack-edpm-ipam\") pod \"2df8decd-481b-464e-aeb2-e92fbbe19d14\" (UID: \"2df8decd-481b-464e-aeb2-e92fbbe19d14\") " Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.568114 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2df8decd-481b-464e-aeb2-e92fbbe19d14" (UID: "2df8decd-481b-464e-aeb2-e92fbbe19d14"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.568330 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2df8decd-481b-464e-aeb2-e92fbbe19d14" (UID: "2df8decd-481b-464e-aeb2-e92fbbe19d14"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.568460 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2df8decd-481b-464e-aeb2-e92fbbe19d14" (UID: "2df8decd-481b-464e-aeb2-e92fbbe19d14"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.568867 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-config" (OuterVolumeSpecName: "config") pod "2df8decd-481b-464e-aeb2-e92fbbe19d14" (UID: "2df8decd-481b-464e-aeb2-e92fbbe19d14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.569222 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2df8decd-481b-464e-aeb2-e92fbbe19d14" (UID: "2df8decd-481b-464e-aeb2-e92fbbe19d14"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.570243 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2df8decd-481b-464e-aeb2-e92fbbe19d14" (UID: "2df8decd-481b-464e-aeb2-e92fbbe19d14"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.598476 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df8decd-481b-464e-aeb2-e92fbbe19d14-kube-api-access-2mshw" (OuterVolumeSpecName: "kube-api-access-2mshw") pod "2df8decd-481b-464e-aeb2-e92fbbe19d14" (UID: "2df8decd-481b-464e-aeb2-e92fbbe19d14"). InnerVolumeSpecName "kube-api-access-2mshw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.671908 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.671951 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.671962 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.671973 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mshw\" (UniqueName: \"kubernetes.io/projected/2df8decd-481b-464e-aeb2-e92fbbe19d14-kube-api-access-2mshw\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.671985 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.672027 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:54 crc kubenswrapper[4834]: I0130 21:41:54.672036 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2df8decd-481b-464e-aeb2-e92fbbe19d14-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:41:55 crc kubenswrapper[4834]: I0130 21:41:55.441828 
4834 generic.go:334] "Generic (PLEG): container finished" podID="626fd761-10dd-4cb5-9dbb-624ea3e5c525" containerID="eaa419e6d123e0be835553575489df721046906e2966b285c30d0bfbceb38703" exitCode=0 Jan 30 21:41:55 crc kubenswrapper[4834]: I0130 21:41:55.441990 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" event={"ID":"626fd761-10dd-4cb5-9dbb-624ea3e5c525","Type":"ContainerDied","Data":"eaa419e6d123e0be835553575489df721046906e2966b285c30d0bfbceb38703"} Jan 30 21:41:55 crc kubenswrapper[4834]: I0130 21:41:55.446208 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:41:55 crc kubenswrapper[4834]: I0130 21:41:55.446727 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9b522ed-a619-4a0c-99dd-9f14c679b469","Type":"ContainerStarted","Data":"3746a0e174cda46d9953b43e9c665fbc8778feac44018b6cea522db538dbe419"} Jan 30 21:41:56 crc kubenswrapper[4834]: I0130 21:41:56.459751 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" event={"ID":"626fd761-10dd-4cb5-9dbb-624ea3e5c525","Type":"ContainerStarted","Data":"c12673bf71d48c630fcc6f84f5040798bd042746d0ef5414f7081d0374a62cde"} Jan 30 21:41:56 crc kubenswrapper[4834]: I0130 21:41:56.460257 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:41:56 crc kubenswrapper[4834]: I0130 21:41:56.463133 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"927e5578-7c09-4caf-ab81-0e8229f8aef0","Type":"ContainerStarted","Data":"62b011c081a1f9704c12b0bcf176c6491a9fece6f01e7ff784f24cb7dab459d1"} Jan 30 21:41:56 crc kubenswrapper[4834]: I0130 21:41:56.503315 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" 
podStartSLOduration=3.5032959999999997 podStartE2EDuration="3.503296s" podCreationTimestamp="2026-01-30 21:41:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:41:56.493162214 +0000 UTC m=+1567.646308382" watchObservedRunningTime="2026-01-30 21:41:56.503296 +0000 UTC m=+1567.656442148" Jan 30 21:42:03 crc kubenswrapper[4834]: I0130 21:42:03.805603 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-sm2xp" Jan 30 21:42:03 crc kubenswrapper[4834]: I0130 21:42:03.891916 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-t2r4g"] Jan 30 21:42:03 crc kubenswrapper[4834]: I0130 21:42:03.892249 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" podUID="89a4de72-8a0e-4fac-a03e-d01ed420df81" containerName="dnsmasq-dns" containerID="cri-o://7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4" gracePeriod=10 Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.160905 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.161196 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.161247 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.162177 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.162225 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" gracePeriod=600 Jan 30 21:42:04 crc kubenswrapper[4834]: E0130 21:42:04.288604 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.496128 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.564799 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" exitCode=0 Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.564898 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a"} Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.564946 4834 scope.go:117] "RemoveContainer" containerID="8384069132f18eea6ac87d501b64935494bfc35764a079472903e1922c44d982" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.565887 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:42:04 crc kubenswrapper[4834]: E0130 21:42:04.566334 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.573750 4834 generic.go:334] "Generic (PLEG): container finished" podID="89a4de72-8a0e-4fac-a03e-d01ed420df81" containerID="7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4" exitCode=0 Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.573804 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" 
event={"ID":"89a4de72-8a0e-4fac-a03e-d01ed420df81","Type":"ContainerDied","Data":"7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4"} Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.573839 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" event={"ID":"89a4de72-8a0e-4fac-a03e-d01ed420df81","Type":"ContainerDied","Data":"9849eee6d9b839afc52f61b638779f9cf814e887daa2624043ca4e0be0c9a6cf"} Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.573912 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-t2r4g" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.625609 4834 scope.go:117] "RemoveContainer" containerID="7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.626311 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-config\") pod \"89a4de72-8a0e-4fac-a03e-d01ed420df81\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.626338 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-svc\") pod \"89a4de72-8a0e-4fac-a03e-d01ed420df81\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.626369 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-sb\") pod \"89a4de72-8a0e-4fac-a03e-d01ed420df81\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.626426 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-msvnd\" (UniqueName: \"kubernetes.io/projected/89a4de72-8a0e-4fac-a03e-d01ed420df81-kube-api-access-msvnd\") pod \"89a4de72-8a0e-4fac-a03e-d01ed420df81\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.626774 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-nb\") pod \"89a4de72-8a0e-4fac-a03e-d01ed420df81\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.626822 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-swift-storage-0\") pod \"89a4de72-8a0e-4fac-a03e-d01ed420df81\" (UID: \"89a4de72-8a0e-4fac-a03e-d01ed420df81\") " Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.642885 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a4de72-8a0e-4fac-a03e-d01ed420df81-kube-api-access-msvnd" (OuterVolumeSpecName: "kube-api-access-msvnd") pod "89a4de72-8a0e-4fac-a03e-d01ed420df81" (UID: "89a4de72-8a0e-4fac-a03e-d01ed420df81"). InnerVolumeSpecName "kube-api-access-msvnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.685894 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89a4de72-8a0e-4fac-a03e-d01ed420df81" (UID: "89a4de72-8a0e-4fac-a03e-d01ed420df81"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.687831 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-config" (OuterVolumeSpecName: "config") pod "89a4de72-8a0e-4fac-a03e-d01ed420df81" (UID: "89a4de72-8a0e-4fac-a03e-d01ed420df81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.690368 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89a4de72-8a0e-4fac-a03e-d01ed420df81" (UID: "89a4de72-8a0e-4fac-a03e-d01ed420df81"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.690379 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89a4de72-8a0e-4fac-a03e-d01ed420df81" (UID: "89a4de72-8a0e-4fac-a03e-d01ed420df81"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.707091 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "89a4de72-8a0e-4fac-a03e-d01ed420df81" (UID: "89a4de72-8a0e-4fac-a03e-d01ed420df81"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.723607 4834 scope.go:117] "RemoveContainer" containerID="ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.730474 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.730565 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.730595 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.730630 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-config\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.730643 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89a4de72-8a0e-4fac-a03e-d01ed420df81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.730654 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msvnd\" (UniqueName: \"kubernetes.io/projected/89a4de72-8a0e-4fac-a03e-d01ed420df81-kube-api-access-msvnd\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.744980 4834 scope.go:117] "RemoveContainer" 
containerID="7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4" Jan 30 21:42:04 crc kubenswrapper[4834]: E0130 21:42:04.745533 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4\": container with ID starting with 7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4 not found: ID does not exist" containerID="7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.745736 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4"} err="failed to get container status \"7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4\": rpc error: code = NotFound desc = could not find container \"7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4\": container with ID starting with 7306665ec144cdd865b146477d87e34aa4a124a1137b76f2999a9e2065dad0c4 not found: ID does not exist" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.745857 4834 scope.go:117] "RemoveContainer" containerID="ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9" Jan 30 21:42:04 crc kubenswrapper[4834]: E0130 21:42:04.746253 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9\": container with ID starting with ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9 not found: ID does not exist" containerID="ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.746306 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9"} err="failed to get container status \"ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9\": rpc error: code = NotFound desc = could not find container \"ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9\": container with ID starting with ba9314cc15bc6a5bac45cda981efa7f648749368f719bad45b34b9739580b8c9 not found: ID does not exist" Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.918012 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-t2r4g"] Jan 30 21:42:04 crc kubenswrapper[4834]: I0130 21:42:04.931604 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-t2r4g"] Jan 30 21:42:05 crc kubenswrapper[4834]: I0130 21:42:05.545533 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a4de72-8a0e-4fac-a03e-d01ed420df81" path="/var/lib/kubelet/pods/89a4de72-8a0e-4fac-a03e-d01ed420df81/volumes" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.521588 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch"] Jan 30 21:42:12 crc kubenswrapper[4834]: E0130 21:42:12.522688 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a4de72-8a0e-4fac-a03e-d01ed420df81" containerName="dnsmasq-dns" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.522707 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a4de72-8a0e-4fac-a03e-d01ed420df81" containerName="dnsmasq-dns" Jan 30 21:42:12 crc kubenswrapper[4834]: E0130 21:42:12.522724 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a4de72-8a0e-4fac-a03e-d01ed420df81" containerName="init" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.522732 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a4de72-8a0e-4fac-a03e-d01ed420df81" containerName="init" Jan 30 
21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.522982 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a4de72-8a0e-4fac-a03e-d01ed420df81" containerName="dnsmasq-dns" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.523874 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.536804 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.537094 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.537227 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.537625 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.545745 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch"] Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.613890 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c769k\" (UniqueName: \"kubernetes.io/projected/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-kube-api-access-c769k\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.613995 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.614449 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.614509 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.716359 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.716475 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: 
\"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.716586 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c769k\" (UniqueName: \"kubernetes.io/projected/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-kube-api-access-c769k\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.716639 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.721492 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.721911 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.723435 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.733014 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c769k\" (UniqueName: \"kubernetes.io/projected/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-kube-api-access-c769k\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:12 crc kubenswrapper[4834]: I0130 21:42:12.863003 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:13 crc kubenswrapper[4834]: I0130 21:42:13.457309 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch"] Jan 30 21:42:13 crc kubenswrapper[4834]: W0130 21:42:13.467600 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda461fd28_a3f9_469a_b3b9_81abf54bf0c6.slice/crio-7f08276a4501ce8935f6fea607f50747899c4418c654b41c9fba0432b64f75f9 WatchSource:0}: Error finding container 7f08276a4501ce8935f6fea607f50747899c4418c654b41c9fba0432b64f75f9: Status 404 returned error can't find the container with id 7f08276a4501ce8935f6fea607f50747899c4418c654b41c9fba0432b64f75f9 Jan 30 21:42:13 crc kubenswrapper[4834]: I0130 21:42:13.688935 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" 
event={"ID":"a461fd28-a3f9-469a-b3b9-81abf54bf0c6","Type":"ContainerStarted","Data":"7f08276a4501ce8935f6fea607f50747899c4418c654b41c9fba0432b64f75f9"} Jan 30 21:42:16 crc kubenswrapper[4834]: I0130 21:42:16.532379 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:42:16 crc kubenswrapper[4834]: E0130 21:42:16.533304 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:42:22 crc kubenswrapper[4834]: I0130 21:42:22.788671 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" event={"ID":"a461fd28-a3f9-469a-b3b9-81abf54bf0c6","Type":"ContainerStarted","Data":"0a56a1dc70b7964a251577324c62e7d3d2e342330ab5207f61b4e5dd53a82756"} Jan 30 21:42:22 crc kubenswrapper[4834]: I0130 21:42:22.821048 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" podStartSLOduration=2.172178285 podStartE2EDuration="10.821027931s" podCreationTimestamp="2026-01-30 21:42:12 +0000 UTC" firstStartedPulling="2026-01-30 21:42:13.469670413 +0000 UTC m=+1584.622816551" lastFinishedPulling="2026-01-30 21:42:22.118520029 +0000 UTC m=+1593.271666197" observedRunningTime="2026-01-30 21:42:22.810833574 +0000 UTC m=+1593.963979732" watchObservedRunningTime="2026-01-30 21:42:22.821027931 +0000 UTC m=+1593.974174089" Jan 30 21:42:25 crc kubenswrapper[4834]: I0130 21:42:25.649303 4834 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","pod2df8decd-481b-464e-aeb2-e92fbbe19d14"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2df8decd-481b-464e-aeb2-e92fbbe19d14] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2df8decd_481b_464e_aeb2_e92fbbe19d14.slice" Jan 30 21:42:25 crc kubenswrapper[4834]: E0130 21:42:25.650085 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod2df8decd-481b-464e-aeb2-e92fbbe19d14] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod2df8decd-481b-464e-aeb2-e92fbbe19d14] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2df8decd_481b_464e_aeb2_e92fbbe19d14.slice" pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" podUID="2df8decd-481b-464e-aeb2-e92fbbe19d14" Jan 30 21:42:25 crc kubenswrapper[4834]: I0130 21:42:25.825010 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-rdk4z" Jan 30 21:42:25 crc kubenswrapper[4834]: I0130 21:42:25.917007 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-rdk4z"] Jan 30 21:42:25 crc kubenswrapper[4834]: I0130 21:42:25.932001 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-rdk4z"] Jan 30 21:42:27 crc kubenswrapper[4834]: I0130 21:42:27.556180 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df8decd-481b-464e-aeb2-e92fbbe19d14" path="/var/lib/kubelet/pods/2df8decd-481b-464e-aeb2-e92fbbe19d14/volumes" Jan 30 21:42:28 crc kubenswrapper[4834]: I0130 21:42:28.531745 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:42:28 crc kubenswrapper[4834]: E0130 21:42:28.532570 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:42:28 crc kubenswrapper[4834]: I0130 21:42:28.860591 4834 generic.go:334] "Generic (PLEG): container finished" podID="927e5578-7c09-4caf-ab81-0e8229f8aef0" containerID="62b011c081a1f9704c12b0bcf176c6491a9fece6f01e7ff784f24cb7dab459d1" exitCode=0 Jan 30 21:42:28 crc kubenswrapper[4834]: I0130 21:42:28.860655 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"927e5578-7c09-4caf-ab81-0e8229f8aef0","Type":"ContainerDied","Data":"62b011c081a1f9704c12b0bcf176c6491a9fece6f01e7ff784f24cb7dab459d1"} Jan 30 21:42:28 crc kubenswrapper[4834]: I0130 21:42:28.863360 4834 generic.go:334] "Generic (PLEG): container finished" podID="e9b522ed-a619-4a0c-99dd-9f14c679b469" containerID="3746a0e174cda46d9953b43e9c665fbc8778feac44018b6cea522db538dbe419" exitCode=0 Jan 30 21:42:28 crc kubenswrapper[4834]: I0130 21:42:28.863437 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9b522ed-a619-4a0c-99dd-9f14c679b469","Type":"ContainerDied","Data":"3746a0e174cda46d9953b43e9c665fbc8778feac44018b6cea522db538dbe419"} Jan 30 21:42:29 crc kubenswrapper[4834]: I0130 21:42:29.878635 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9b522ed-a619-4a0c-99dd-9f14c679b469","Type":"ContainerStarted","Data":"0e00ed16aef547cc628b1febc71a8d4563a3102a4005d94e10eee2c9df12895e"} Jan 30 21:42:29 crc kubenswrapper[4834]: I0130 21:42:29.879187 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 30 21:42:29 crc kubenswrapper[4834]: I0130 21:42:29.880866 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"927e5578-7c09-4caf-ab81-0e8229f8aef0","Type":"ContainerStarted","Data":"9b979b61a1282cde6b82159200ea39159a4081065c55389fc1170acd20cb50ed"} Jan 30 21:42:29 crc kubenswrapper[4834]: I0130 21:42:29.881145 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:42:29 crc kubenswrapper[4834]: I0130 21:42:29.918620 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.918598116 podStartE2EDuration="37.918598116s" podCreationTimestamp="2026-01-30 21:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:42:29.913003368 +0000 UTC m=+1601.066149506" watchObservedRunningTime="2026-01-30 21:42:29.918598116 +0000 UTC m=+1601.071744264" Jan 30 21:42:29 crc kubenswrapper[4834]: I0130 21:42:29.944540 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.944517617 podStartE2EDuration="37.944517617s" podCreationTimestamp="2026-01-30 21:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 21:42:29.940184755 +0000 UTC m=+1601.093330913" watchObservedRunningTime="2026-01-30 21:42:29.944517617 +0000 UTC m=+1601.097663765" Jan 30 21:42:33 crc kubenswrapper[4834]: I0130 21:42:33.961240 4834 generic.go:334] "Generic (PLEG): container finished" podID="a461fd28-a3f9-469a-b3b9-81abf54bf0c6" containerID="0a56a1dc70b7964a251577324c62e7d3d2e342330ab5207f61b4e5dd53a82756" exitCode=0 Jan 30 21:42:33 crc kubenswrapper[4834]: I0130 21:42:33.967475 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" 
event={"ID":"a461fd28-a3f9-469a-b3b9-81abf54bf0c6","Type":"ContainerDied","Data":"0a56a1dc70b7964a251577324c62e7d3d2e342330ab5207f61b4e5dd53a82756"} Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.502307 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.550303 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-inventory\") pod \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.550707 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c769k\" (UniqueName: \"kubernetes.io/projected/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-kube-api-access-c769k\") pod \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.550730 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-ssh-key-openstack-edpm-ipam\") pod \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.550750 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-repo-setup-combined-ca-bundle\") pod \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\" (UID: \"a461fd28-a3f9-469a-b3b9-81abf54bf0c6\") " Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.558549 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a461fd28-a3f9-469a-b3b9-81abf54bf0c6" (UID: "a461fd28-a3f9-469a-b3b9-81abf54bf0c6"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.558726 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-kube-api-access-c769k" (OuterVolumeSpecName: "kube-api-access-c769k") pod "a461fd28-a3f9-469a-b3b9-81abf54bf0c6" (UID: "a461fd28-a3f9-469a-b3b9-81abf54bf0c6"). InnerVolumeSpecName "kube-api-access-c769k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.587879 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a461fd28-a3f9-469a-b3b9-81abf54bf0c6" (UID: "a461fd28-a3f9-469a-b3b9-81abf54bf0c6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.593954 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-inventory" (OuterVolumeSpecName: "inventory") pod "a461fd28-a3f9-469a-b3b9-81abf54bf0c6" (UID: "a461fd28-a3f9-469a-b3b9-81abf54bf0c6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.653602 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.653644 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c769k\" (UniqueName: \"kubernetes.io/projected/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-kube-api-access-c769k\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.653660 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:35 crc kubenswrapper[4834]: I0130 21:42:35.653673 4834 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a461fd28-a3f9-469a-b3b9-81abf54bf0c6-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.011329 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" event={"ID":"a461fd28-a3f9-469a-b3b9-81abf54bf0c6","Type":"ContainerDied","Data":"7f08276a4501ce8935f6fea607f50747899c4418c654b41c9fba0432b64f75f9"} Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.011376 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f08276a4501ce8935f6fea607f50747899c4418c654b41c9fba0432b64f75f9" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.011455 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.094722 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs"] Jan 30 21:42:36 crc kubenswrapper[4834]: E0130 21:42:36.095262 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a461fd28-a3f9-469a-b3b9-81abf54bf0c6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.095287 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a461fd28-a3f9-469a-b3b9-81abf54bf0c6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.095564 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a461fd28-a3f9-469a-b3b9-81abf54bf0c6" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.096424 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.099059 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.099214 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.100177 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.100323 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.114217 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs"] Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.169680 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.169825 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.169916 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm9tf\" (UniqueName: \"kubernetes.io/projected/e9533467-64e4-405d-9086-94b32f633d20-kube-api-access-zm9tf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.272231 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm9tf\" (UniqueName: \"kubernetes.io/projected/e9533467-64e4-405d-9086-94b32f633d20-kube-api-access-zm9tf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.272372 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.272513 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.277729 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: 
\"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.279031 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.290883 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm9tf\" (UniqueName: \"kubernetes.io/projected/e9533467-64e4-405d-9086-94b32f633d20-kube-api-access-zm9tf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xpqzs\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:36 crc kubenswrapper[4834]: I0130 21:42:36.453340 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:37 crc kubenswrapper[4834]: I0130 21:42:37.153049 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs"] Jan 30 21:42:38 crc kubenswrapper[4834]: I0130 21:42:38.031443 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" event={"ID":"e9533467-64e4-405d-9086-94b32f633d20","Type":"ContainerStarted","Data":"614c4cb78c1fcda5df04ecead16339e8cea44908ef1b2d46abadc8079e19cf9e"} Jan 30 21:42:38 crc kubenswrapper[4834]: I0130 21:42:38.031780 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" event={"ID":"e9533467-64e4-405d-9086-94b32f633d20","Type":"ContainerStarted","Data":"3e01f150ccc0612a71de8f4155b47584be562246ad9aeda3d7d33194b850fabf"} Jan 30 21:42:38 crc kubenswrapper[4834]: I0130 21:42:38.052794 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" podStartSLOduration=1.528930009 podStartE2EDuration="2.05277399s" podCreationTimestamp="2026-01-30 21:42:36 +0000 UTC" firstStartedPulling="2026-01-30 21:42:37.161605415 +0000 UTC m=+1608.314751573" lastFinishedPulling="2026-01-30 21:42:37.685449416 +0000 UTC m=+1608.838595554" observedRunningTime="2026-01-30 21:42:38.051983668 +0000 UTC m=+1609.205129806" watchObservedRunningTime="2026-01-30 21:42:38.05277399 +0000 UTC m=+1609.205920128" Jan 30 21:42:41 crc kubenswrapper[4834]: I0130 21:42:41.069843 4834 generic.go:334] "Generic (PLEG): container finished" podID="e9533467-64e4-405d-9086-94b32f633d20" containerID="614c4cb78c1fcda5df04ecead16339e8cea44908ef1b2d46abadc8079e19cf9e" exitCode=0 Jan 30 21:42:41 crc kubenswrapper[4834]: I0130 21:42:41.069906 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" event={"ID":"e9533467-64e4-405d-9086-94b32f633d20","Type":"ContainerDied","Data":"614c4cb78c1fcda5df04ecead16339e8cea44908ef1b2d46abadc8079e19cf9e"} Jan 30 21:42:41 crc kubenswrapper[4834]: I0130 21:42:41.531692 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:42:41 crc kubenswrapper[4834]: E0130 21:42:41.532297 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.584329 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.719652 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-inventory\") pod \"e9533467-64e4-405d-9086-94b32f633d20\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.719923 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-ssh-key-openstack-edpm-ipam\") pod \"e9533467-64e4-405d-9086-94b32f633d20\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.719988 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm9tf\" (UniqueName: \"kubernetes.io/projected/e9533467-64e4-405d-9086-94b32f633d20-kube-api-access-zm9tf\") pod \"e9533467-64e4-405d-9086-94b32f633d20\" (UID: \"e9533467-64e4-405d-9086-94b32f633d20\") " Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.738715 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9533467-64e4-405d-9086-94b32f633d20-kube-api-access-zm9tf" (OuterVolumeSpecName: "kube-api-access-zm9tf") pod "e9533467-64e4-405d-9086-94b32f633d20" (UID: "e9533467-64e4-405d-9086-94b32f633d20"). InnerVolumeSpecName "kube-api-access-zm9tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.757267 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-inventory" (OuterVolumeSpecName: "inventory") pod "e9533467-64e4-405d-9086-94b32f633d20" (UID: "e9533467-64e4-405d-9086-94b32f633d20"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.773974 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9533467-64e4-405d-9086-94b32f633d20" (UID: "e9533467-64e4-405d-9086-94b32f633d20"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.822878 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm9tf\" (UniqueName: \"kubernetes.io/projected/e9533467-64e4-405d-9086-94b32f633d20-kube-api-access-zm9tf\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.822937 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.822960 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9533467-64e4-405d-9086-94b32f633d20-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:42:42 crc kubenswrapper[4834]: I0130 21:42:42.854527 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.072163 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.133499 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" 
event={"ID":"e9533467-64e4-405d-9086-94b32f633d20","Type":"ContainerDied","Data":"3e01f150ccc0612a71de8f4155b47584be562246ad9aeda3d7d33194b850fabf"} Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.133537 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e01f150ccc0612a71de8f4155b47584be562246ad9aeda3d7d33194b850fabf" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.133592 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xpqzs" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.246506 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk"] Jan 30 21:42:43 crc kubenswrapper[4834]: E0130 21:42:43.246894 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9533467-64e4-405d-9086-94b32f633d20" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.246910 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9533467-64e4-405d-9086-94b32f633d20" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.247117 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9533467-64e4-405d-9086-94b32f633d20" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.247782 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.250701 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.254816 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.255045 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.255167 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.282361 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk"] Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.336538 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.336673 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: 
I0130 21:42:43.336709 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.336809 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c4cb\" (UniqueName: \"kubernetes.io/projected/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-kube-api-access-5c4cb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.438898 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c4cb\" (UniqueName: \"kubernetes.io/projected/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-kube-api-access-5c4cb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.439012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.439077 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.439102 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.443588 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.444072 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.445377 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.457158 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c4cb\" (UniqueName: \"kubernetes.io/projected/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-kube-api-access-5c4cb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:43 crc kubenswrapper[4834]: I0130 21:42:43.583122 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:42:44 crc kubenswrapper[4834]: W0130 21:42:44.103240 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d7c7096_9310_450a_8562_4aa5ee7c3b4d.slice/crio-2e3e17f2602922379f9bc3c1118394abe46a3d69060932ba4db71e55478555fe WatchSource:0}: Error finding container 2e3e17f2602922379f9bc3c1118394abe46a3d69060932ba4db71e55478555fe: Status 404 returned error can't find the container with id 2e3e17f2602922379f9bc3c1118394abe46a3d69060932ba4db71e55478555fe Jan 30 21:42:44 crc kubenswrapper[4834]: I0130 21:42:44.104677 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk"] Jan 30 21:42:44 crc kubenswrapper[4834]: I0130 21:42:44.144497 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" event={"ID":"8d7c7096-9310-450a-8562-4aa5ee7c3b4d","Type":"ContainerStarted","Data":"2e3e17f2602922379f9bc3c1118394abe46a3d69060932ba4db71e55478555fe"} Jan 30 21:42:45 crc kubenswrapper[4834]: I0130 21:42:45.155087 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" 
event={"ID":"8d7c7096-9310-450a-8562-4aa5ee7c3b4d","Type":"ContainerStarted","Data":"24930cc6d348282cfe4e59af4f2ad979d0a053450deb4d8e895d7b760d7227c6"} Jan 30 21:42:45 crc kubenswrapper[4834]: I0130 21:42:45.183799 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" podStartSLOduration=1.7929278 podStartE2EDuration="2.183782569s" podCreationTimestamp="2026-01-30 21:42:43 +0000 UTC" firstStartedPulling="2026-01-30 21:42:44.105739771 +0000 UTC m=+1615.258885909" lastFinishedPulling="2026-01-30 21:42:44.49659453 +0000 UTC m=+1615.649740678" observedRunningTime="2026-01-30 21:42:45.176423482 +0000 UTC m=+1616.329569620" watchObservedRunningTime="2026-01-30 21:42:45.183782569 +0000 UTC m=+1616.336928707" Jan 30 21:42:55 crc kubenswrapper[4834]: I0130 21:42:55.531621 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:42:55 crc kubenswrapper[4834]: E0130 21:42:55.532698 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:42:57 crc kubenswrapper[4834]: I0130 21:42:57.336402 4834 scope.go:117] "RemoveContainer" containerID="ff8dc2bf48bde71be8dd7f9e99850cad657cbb62d67123905a28d5641c81cae9" Jan 30 21:42:57 crc kubenswrapper[4834]: I0130 21:42:57.395684 4834 scope.go:117] "RemoveContainer" containerID="b5f40a5302db9286d8889928391704f4344288840c59da33e134a6c141d1f7cc" Jan 30 21:42:57 crc kubenswrapper[4834]: I0130 21:42:57.425030 4834 scope.go:117] "RemoveContainer" 
containerID="8b6485eff1d46030a91f054fd46ed3228c039d232ba76177611ab9294bb8f54b" Jan 30 21:42:57 crc kubenswrapper[4834]: I0130 21:42:57.479028 4834 scope.go:117] "RemoveContainer" containerID="1dfc18f738a42fea16bef16f66c063db2e73e7bc2d4fcf8e74f75bdef09ebcc8" Jan 30 21:42:57 crc kubenswrapper[4834]: I0130 21:42:57.518259 4834 scope.go:117] "RemoveContainer" containerID="298b0d54929de73c138aeb7f470a15fc04c1aa6fa18b6a5212045046eec7737f" Jan 30 21:42:57 crc kubenswrapper[4834]: I0130 21:42:57.558146 4834 scope.go:117] "RemoveContainer" containerID="a2f57942c10e77446bfc01dc91bd3dc17789a4321895b0d223014e4622a2f767" Jan 30 21:42:57 crc kubenswrapper[4834]: I0130 21:42:57.606039 4834 scope.go:117] "RemoveContainer" containerID="30b429785ae07574307d2ee24ed2e4655ab0e90616b6dabdba9cd230fe9ee263" Jan 30 21:43:10 crc kubenswrapper[4834]: I0130 21:43:10.531115 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:43:10 crc kubenswrapper[4834]: E0130 21:43:10.532135 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:43:23 crc kubenswrapper[4834]: I0130 21:43:23.531883 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:43:23 crc kubenswrapper[4834]: E0130 21:43:23.533145 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:43:38 crc kubenswrapper[4834]: I0130 21:43:38.537772 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:43:38 crc kubenswrapper[4834]: E0130 21:43:38.538803 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:43:49 crc kubenswrapper[4834]: I0130 21:43:49.567118 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:43:49 crc kubenswrapper[4834]: E0130 21:43:49.567975 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:43:57 crc kubenswrapper[4834]: I0130 21:43:57.808557 4834 scope.go:117] "RemoveContainer" containerID="55d405e8e2ddbc9806064fff95bbf045eabbeda236a827c41132dbdcdcaa706f" Jan 30 21:44:04 crc kubenswrapper[4834]: I0130 21:44:04.532813 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:44:04 crc kubenswrapper[4834]: E0130 21:44:04.533904 4834 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:44:19 crc kubenswrapper[4834]: I0130 21:44:19.539318 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:44:19 crc kubenswrapper[4834]: E0130 21:44:19.540139 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:44:31 crc kubenswrapper[4834]: I0130 21:44:31.532187 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:44:31 crc kubenswrapper[4834]: E0130 21:44:31.532959 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:44:46 crc kubenswrapper[4834]: I0130 21:44:46.531824 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:44:46 crc kubenswrapper[4834]: E0130 
21:44:46.533174 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:44:57 crc kubenswrapper[4834]: I0130 21:44:57.912335 4834 scope.go:117] "RemoveContainer" containerID="e28acc9c1e4817836d5543db7117e4396c7112aa85f5570f6ade46093b2feee6" Jan 30 21:44:57 crc kubenswrapper[4834]: I0130 21:44:57.945161 4834 scope.go:117] "RemoveContainer" containerID="bb55cab1192d1792936202539ace31ed17430c9bd2639b4b38752b99acfc8892" Jan 30 21:44:57 crc kubenswrapper[4834]: I0130 21:44:57.966833 4834 scope.go:117] "RemoveContainer" containerID="c98617e6f9ebcea636edbe6c5ccb33beab7c9a970348b990280b8f7e1a05d437" Jan 30 21:44:57 crc kubenswrapper[4834]: I0130 21:44:57.995797 4834 scope.go:117] "RemoveContainer" containerID="d309fd4b63a2c82d319fbcbe70b6aa1db6c81af8c09994d42551c5f5434d0c5b" Jan 30 21:44:58 crc kubenswrapper[4834]: I0130 21:44:58.028045 4834 scope.go:117] "RemoveContainer" containerID="0486912439aa342c5a076ccb76600a914e085bc3c5d1e088bee3eb78712c9d29" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.161542 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm"] Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.163119 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.166017 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.166519 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.186426 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm"] Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.285493 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdvr5\" (UniqueName: \"kubernetes.io/projected/bce5783d-a2d0-4d28-8a2b-5919239a2124-kube-api-access-xdvr5\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.285798 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bce5783d-a2d0-4d28-8a2b-5919239a2124-config-volume\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.285952 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bce5783d-a2d0-4d28-8a2b-5919239a2124-secret-volume\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.387659 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bce5783d-a2d0-4d28-8a2b-5919239a2124-secret-volume\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.387785 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvr5\" (UniqueName: \"kubernetes.io/projected/bce5783d-a2d0-4d28-8a2b-5919239a2124-kube-api-access-xdvr5\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.387821 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bce5783d-a2d0-4d28-8a2b-5919239a2124-config-volume\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.389181 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bce5783d-a2d0-4d28-8a2b-5919239a2124-config-volume\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.393249 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/bce5783d-a2d0-4d28-8a2b-5919239a2124-secret-volume\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.409516 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdvr5\" (UniqueName: \"kubernetes.io/projected/bce5783d-a2d0-4d28-8a2b-5919239a2124-kube-api-access-xdvr5\") pod \"collect-profiles-29496825-j4znm\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.484172 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:00 crc kubenswrapper[4834]: I0130 21:45:00.964987 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm"] Jan 30 21:45:01 crc kubenswrapper[4834]: I0130 21:45:01.532903 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:45:01 crc kubenswrapper[4834]: E0130 21:45:01.533117 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:45:01 crc kubenswrapper[4834]: I0130 21:45:01.775240 4834 generic.go:334] "Generic (PLEG): container finished" podID="bce5783d-a2d0-4d28-8a2b-5919239a2124" containerID="d530e8dd9adf41d9814f630f1cf56442197382b0bc931763596fea279c52bc3b" 
exitCode=0 Jan 30 21:45:01 crc kubenswrapper[4834]: I0130 21:45:01.775427 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" event={"ID":"bce5783d-a2d0-4d28-8a2b-5919239a2124","Type":"ContainerDied","Data":"d530e8dd9adf41d9814f630f1cf56442197382b0bc931763596fea279c52bc3b"} Jan 30 21:45:01 crc kubenswrapper[4834]: I0130 21:45:01.775578 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" event={"ID":"bce5783d-a2d0-4d28-8a2b-5919239a2124","Type":"ContainerStarted","Data":"732f3cff488e58c5ae90744af8d838da90d15036c4e637b7d0846fe691969e33"} Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.222055 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.252253 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdvr5\" (UniqueName: \"kubernetes.io/projected/bce5783d-a2d0-4d28-8a2b-5919239a2124-kube-api-access-xdvr5\") pod \"bce5783d-a2d0-4d28-8a2b-5919239a2124\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.252299 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bce5783d-a2d0-4d28-8a2b-5919239a2124-secret-volume\") pod \"bce5783d-a2d0-4d28-8a2b-5919239a2124\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.252321 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bce5783d-a2d0-4d28-8a2b-5919239a2124-config-volume\") pod \"bce5783d-a2d0-4d28-8a2b-5919239a2124\" (UID: \"bce5783d-a2d0-4d28-8a2b-5919239a2124\") " Jan 30 
21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.253142 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce5783d-a2d0-4d28-8a2b-5919239a2124-config-volume" (OuterVolumeSpecName: "config-volume") pod "bce5783d-a2d0-4d28-8a2b-5919239a2124" (UID: "bce5783d-a2d0-4d28-8a2b-5919239a2124"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.260560 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce5783d-a2d0-4d28-8a2b-5919239a2124-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bce5783d-a2d0-4d28-8a2b-5919239a2124" (UID: "bce5783d-a2d0-4d28-8a2b-5919239a2124"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.265579 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce5783d-a2d0-4d28-8a2b-5919239a2124-kube-api-access-xdvr5" (OuterVolumeSpecName: "kube-api-access-xdvr5") pod "bce5783d-a2d0-4d28-8a2b-5919239a2124" (UID: "bce5783d-a2d0-4d28-8a2b-5919239a2124"). InnerVolumeSpecName "kube-api-access-xdvr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.354639 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bce5783d-a2d0-4d28-8a2b-5919239a2124-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.354667 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bce5783d-a2d0-4d28-8a2b-5919239a2124-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.354677 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdvr5\" (UniqueName: \"kubernetes.io/projected/bce5783d-a2d0-4d28-8a2b-5919239a2124-kube-api-access-xdvr5\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.797136 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" event={"ID":"bce5783d-a2d0-4d28-8a2b-5919239a2124","Type":"ContainerDied","Data":"732f3cff488e58c5ae90744af8d838da90d15036c4e637b7d0846fe691969e33"} Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.797172 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="732f3cff488e58c5ae90744af8d838da90d15036c4e637b7d0846fe691969e33" Jan 30 21:45:03 crc kubenswrapper[4834]: I0130 21:45:03.797220 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496825-j4znm" Jan 30 21:45:13 crc kubenswrapper[4834]: I0130 21:45:13.531896 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:45:13 crc kubenswrapper[4834]: E0130 21:45:13.532709 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.497286 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xn27b"] Jan 30 21:45:24 crc kubenswrapper[4834]: E0130 21:45:24.498689 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce5783d-a2d0-4d28-8a2b-5919239a2124" containerName="collect-profiles" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.498710 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce5783d-a2d0-4d28-8a2b-5919239a2124" containerName="collect-profiles" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.498979 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce5783d-a2d0-4d28-8a2b-5919239a2124" containerName="collect-profiles" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.500781 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.510008 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xn27b"] Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.538086 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:45:24 crc kubenswrapper[4834]: E0130 21:45:24.538806 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.642499 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6p2z\" (UniqueName: \"kubernetes.io/projected/dad33907-43bc-4799-987f-24fe76451564-kube-api-access-j6p2z\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.642663 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-catalog-content\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.642695 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-utilities\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.744864 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-catalog-content\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.744914 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-utilities\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.745021 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6p2z\" (UniqueName: \"kubernetes.io/projected/dad33907-43bc-4799-987f-24fe76451564-kube-api-access-j6p2z\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.745732 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-catalog-content\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.745950 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-utilities\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.765237 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6p2z\" (UniqueName: \"kubernetes.io/projected/dad33907-43bc-4799-987f-24fe76451564-kube-api-access-j6p2z\") pod \"community-operators-xn27b\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:24 crc kubenswrapper[4834]: I0130 21:45:24.854611 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:25 crc kubenswrapper[4834]: I0130 21:45:25.397276 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xn27b"] Jan 30 21:45:25 crc kubenswrapper[4834]: I0130 21:45:25.993890 4834 generic.go:334] "Generic (PLEG): container finished" podID="dad33907-43bc-4799-987f-24fe76451564" containerID="5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2" exitCode=0 Jan 30 21:45:25 crc kubenswrapper[4834]: I0130 21:45:25.993943 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn27b" event={"ID":"dad33907-43bc-4799-987f-24fe76451564","Type":"ContainerDied","Data":"5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2"} Jan 30 21:45:25 crc kubenswrapper[4834]: I0130 21:45:25.994181 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn27b" event={"ID":"dad33907-43bc-4799-987f-24fe76451564","Type":"ContainerStarted","Data":"23069720989eafcaec1c51d5f88c3f1866456df19ece41311dffcd305082a48d"} Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:25.996172 4834 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.701007 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5bc7f"] Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.703181 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.712512 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bc7f"] Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.890221 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdpr8\" (UniqueName: \"kubernetes.io/projected/92978c87-3d4e-448c-960d-cc37d8f37c93-kube-api-access-zdpr8\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.890652 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-catalog-content\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.890738 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-utilities\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.992806 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-catalog-content\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.992881 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-utilities\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.992981 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdpr8\" (UniqueName: \"kubernetes.io/projected/92978c87-3d4e-448c-960d-cc37d8f37c93-kube-api-access-zdpr8\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.994000 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-utilities\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:26 crc kubenswrapper[4834]: I0130 21:45:26.994082 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-catalog-content\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:27 crc kubenswrapper[4834]: I0130 21:45:27.016029 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdpr8\" (UniqueName: 
\"kubernetes.io/projected/92978c87-3d4e-448c-960d-cc37d8f37c93-kube-api-access-zdpr8\") pod \"redhat-marketplace-5bc7f\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:27 crc kubenswrapper[4834]: I0130 21:45:27.023480 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:27 crc kubenswrapper[4834]: I0130 21:45:27.675230 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bc7f"] Jan 30 21:45:28 crc kubenswrapper[4834]: I0130 21:45:28.019727 4834 generic.go:334] "Generic (PLEG): container finished" podID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerID="a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b" exitCode=0 Jan 30 21:45:28 crc kubenswrapper[4834]: I0130 21:45:28.019832 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bc7f" event={"ID":"92978c87-3d4e-448c-960d-cc37d8f37c93","Type":"ContainerDied","Data":"a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b"} Jan 30 21:45:28 crc kubenswrapper[4834]: I0130 21:45:28.019896 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bc7f" event={"ID":"92978c87-3d4e-448c-960d-cc37d8f37c93","Type":"ContainerStarted","Data":"1f61270fe692dc9dcbf2c290cc3ef030689457811a3e56f8072d49b153e1ac8e"} Jan 30 21:45:28 crc kubenswrapper[4834]: I0130 21:45:28.022818 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn27b" event={"ID":"dad33907-43bc-4799-987f-24fe76451564","Type":"ContainerStarted","Data":"502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d"} Jan 30 21:45:32 crc kubenswrapper[4834]: E0130 21:45:32.744582 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92978c87_3d4e_448c_960d_cc37d8f37c93.slice/crio-conmon-670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d.scope\": RecentStats: unable to find data in memory cache]" Jan 30 21:45:33 crc kubenswrapper[4834]: I0130 21:45:33.080501 4834 generic.go:334] "Generic (PLEG): container finished" podID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerID="670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d" exitCode=0 Jan 30 21:45:33 crc kubenswrapper[4834]: I0130 21:45:33.080543 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bc7f" event={"ID":"92978c87-3d4e-448c-960d-cc37d8f37c93","Type":"ContainerDied","Data":"670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d"} Jan 30 21:45:35 crc kubenswrapper[4834]: I0130 21:45:35.108043 4834 generic.go:334] "Generic (PLEG): container finished" podID="dad33907-43bc-4799-987f-24fe76451564" containerID="502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d" exitCode=0 Jan 30 21:45:35 crc kubenswrapper[4834]: I0130 21:45:35.108106 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn27b" event={"ID":"dad33907-43bc-4799-987f-24fe76451564","Type":"ContainerDied","Data":"502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d"} Jan 30 21:45:35 crc kubenswrapper[4834]: I0130 21:45:35.113715 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bc7f" event={"ID":"92978c87-3d4e-448c-960d-cc37d8f37c93","Type":"ContainerStarted","Data":"0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153"} Jan 30 21:45:35 crc kubenswrapper[4834]: I0130 21:45:35.169160 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5bc7f" podStartSLOduration=2.972230208 podStartE2EDuration="9.169144427s" 
podCreationTimestamp="2026-01-30 21:45:26 +0000 UTC" firstStartedPulling="2026-01-30 21:45:28.022081686 +0000 UTC m=+1779.175227834" lastFinishedPulling="2026-01-30 21:45:34.218995915 +0000 UTC m=+1785.372142053" observedRunningTime="2026-01-30 21:45:35.165261827 +0000 UTC m=+1786.318407975" watchObservedRunningTime="2026-01-30 21:45:35.169144427 +0000 UTC m=+1786.322290565" Jan 30 21:45:35 crc kubenswrapper[4834]: I0130 21:45:35.531159 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:45:35 crc kubenswrapper[4834]: E0130 21:45:35.531675 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:45:37 crc kubenswrapper[4834]: I0130 21:45:37.023593 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:37 crc kubenswrapper[4834]: I0130 21:45:37.024123 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:37 crc kubenswrapper[4834]: I0130 21:45:37.101619 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:37 crc kubenswrapper[4834]: I0130 21:45:37.134176 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn27b" event={"ID":"dad33907-43bc-4799-987f-24fe76451564","Type":"ContainerStarted","Data":"ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff"} Jan 30 21:45:37 crc kubenswrapper[4834]: I0130 
21:45:37.153310 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xn27b" podStartSLOduration=2.571752934 podStartE2EDuration="13.153283414s" podCreationTimestamp="2026-01-30 21:45:24 +0000 UTC" firstStartedPulling="2026-01-30 21:45:25.995921665 +0000 UTC m=+1777.149067803" lastFinishedPulling="2026-01-30 21:45:36.577452125 +0000 UTC m=+1787.730598283" observedRunningTime="2026-01-30 21:45:37.152261906 +0000 UTC m=+1788.305408044" watchObservedRunningTime="2026-01-30 21:45:37.153283414 +0000 UTC m=+1788.306429562" Jan 30 21:45:44 crc kubenswrapper[4834]: I0130 21:45:44.855758 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:44 crc kubenswrapper[4834]: I0130 21:45:44.856379 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:44 crc kubenswrapper[4834]: I0130 21:45:44.919664 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:45 crc kubenswrapper[4834]: I0130 21:45:45.272723 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:45 crc kubenswrapper[4834]: I0130 21:45:45.335610 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xn27b"] Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.094450 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.233975 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xn27b" podUID="dad33907-43bc-4799-987f-24fe76451564" containerName="registry-server" 
containerID="cri-o://ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff" gracePeriod=2 Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.563418 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bc7f"] Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.563658 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5bc7f" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerName="registry-server" containerID="cri-o://0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153" gracePeriod=2 Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.818285 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.957638 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-catalog-content\") pod \"dad33907-43bc-4799-987f-24fe76451564\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.957699 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-utilities\") pod \"dad33907-43bc-4799-987f-24fe76451564\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.957847 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6p2z\" (UniqueName: \"kubernetes.io/projected/dad33907-43bc-4799-987f-24fe76451564-kube-api-access-j6p2z\") pod \"dad33907-43bc-4799-987f-24fe76451564\" (UID: \"dad33907-43bc-4799-987f-24fe76451564\") " Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.968103 4834 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-utilities" (OuterVolumeSpecName: "utilities") pod "dad33907-43bc-4799-987f-24fe76451564" (UID: "dad33907-43bc-4799-987f-24fe76451564"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:45:47 crc kubenswrapper[4834]: I0130 21:45:47.995052 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dad33907-43bc-4799-987f-24fe76451564-kube-api-access-j6p2z" (OuterVolumeSpecName: "kube-api-access-j6p2z") pod "dad33907-43bc-4799-987f-24fe76451564" (UID: "dad33907-43bc-4799-987f-24fe76451564"). InnerVolumeSpecName "kube-api-access-j6p2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.069068 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6p2z\" (UniqueName: \"kubernetes.io/projected/dad33907-43bc-4799-987f-24fe76451564-kube-api-access-j6p2z\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.069105 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.072410 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dad33907-43bc-4799-987f-24fe76451564" (UID: "dad33907-43bc-4799-987f-24fe76451564"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.112314 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.171174 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdpr8\" (UniqueName: \"kubernetes.io/projected/92978c87-3d4e-448c-960d-cc37d8f37c93-kube-api-access-zdpr8\") pod \"92978c87-3d4e-448c-960d-cc37d8f37c93\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.171606 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-catalog-content\") pod \"92978c87-3d4e-448c-960d-cc37d8f37c93\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.171651 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-utilities\") pod \"92978c87-3d4e-448c-960d-cc37d8f37c93\" (UID: \"92978c87-3d4e-448c-960d-cc37d8f37c93\") " Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.172104 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dad33907-43bc-4799-987f-24fe76451564-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.172716 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-utilities" (OuterVolumeSpecName: "utilities") pod "92978c87-3d4e-448c-960d-cc37d8f37c93" (UID: "92978c87-3d4e-448c-960d-cc37d8f37c93"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.182584 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92978c87-3d4e-448c-960d-cc37d8f37c93-kube-api-access-zdpr8" (OuterVolumeSpecName: "kube-api-access-zdpr8") pod "92978c87-3d4e-448c-960d-cc37d8f37c93" (UID: "92978c87-3d4e-448c-960d-cc37d8f37c93"). InnerVolumeSpecName "kube-api-access-zdpr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.196136 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92978c87-3d4e-448c-960d-cc37d8f37c93" (UID: "92978c87-3d4e-448c-960d-cc37d8f37c93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.245735 4834 generic.go:334] "Generic (PLEG): container finished" podID="dad33907-43bc-4799-987f-24fe76451564" containerID="ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff" exitCode=0 Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.245777 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn27b" event={"ID":"dad33907-43bc-4799-987f-24fe76451564","Type":"ContainerDied","Data":"ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff"} Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.245828 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xn27b" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.245846 4834 scope.go:117] "RemoveContainer" containerID="ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.245835 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xn27b" event={"ID":"dad33907-43bc-4799-987f-24fe76451564","Type":"ContainerDied","Data":"23069720989eafcaec1c51d5f88c3f1866456df19ece41311dffcd305082a48d"} Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.250050 4834 generic.go:334] "Generic (PLEG): container finished" podID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerID="0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153" exitCode=0 Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.250077 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bc7f" event={"ID":"92978c87-3d4e-448c-960d-cc37d8f37c93","Type":"ContainerDied","Data":"0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153"} Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.250104 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5bc7f" event={"ID":"92978c87-3d4e-448c-960d-cc37d8f37c93","Type":"ContainerDied","Data":"1f61270fe692dc9dcbf2c290cc3ef030689457811a3e56f8072d49b153e1ac8e"} Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.250153 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5bc7f" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.275034 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdpr8\" (UniqueName: \"kubernetes.io/projected/92978c87-3d4e-448c-960d-cc37d8f37c93-kube-api-access-zdpr8\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.275095 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.275108 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92978c87-3d4e-448c-960d-cc37d8f37c93-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.279651 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xn27b"] Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.285487 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xn27b"] Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.289182 4834 scope.go:117] "RemoveContainer" containerID="502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.300979 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bc7f"] Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.309301 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5bc7f"] Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.324302 4834 scope.go:117] "RemoveContainer" containerID="5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.345466 
4834 scope.go:117] "RemoveContainer" containerID="ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff" Jan 30 21:45:48 crc kubenswrapper[4834]: E0130 21:45:48.346324 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff\": container with ID starting with ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff not found: ID does not exist" containerID="ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.346361 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff"} err="failed to get container status \"ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff\": rpc error: code = NotFound desc = could not find container \"ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff\": container with ID starting with ae20391bc517f88720a10e6a1a56ad088191aa1922717053b2ba8b6c0eb41fff not found: ID does not exist" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.346382 4834 scope.go:117] "RemoveContainer" containerID="502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d" Jan 30 21:45:48 crc kubenswrapper[4834]: E0130 21:45:48.346749 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d\": container with ID starting with 502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d not found: ID does not exist" containerID="502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.346766 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d"} err="failed to get container status \"502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d\": rpc error: code = NotFound desc = could not find container \"502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d\": container with ID starting with 502b09d81e9ff4cf97d5ef31f0afb7b4b27806a05d3ff62020dcd8319aa5584d not found: ID does not exist" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.346792 4834 scope.go:117] "RemoveContainer" containerID="5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2" Jan 30 21:45:48 crc kubenswrapper[4834]: E0130 21:45:48.347297 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2\": container with ID starting with 5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2 not found: ID does not exist" containerID="5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.347368 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2"} err="failed to get container status \"5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2\": rpc error: code = NotFound desc = could not find container \"5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2\": container with ID starting with 5b55b5bf40eb7ba3675ef67fc1855371cb2a0979b7e4f50071e988dd1639e2c2 not found: ID does not exist" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.347431 4834 scope.go:117] "RemoveContainer" containerID="0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.414347 4834 scope.go:117] "RemoveContainer" 
containerID="670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.436116 4834 scope.go:117] "RemoveContainer" containerID="a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.475680 4834 scope.go:117] "RemoveContainer" containerID="0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153" Jan 30 21:45:48 crc kubenswrapper[4834]: E0130 21:45:48.476090 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153\": container with ID starting with 0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153 not found: ID does not exist" containerID="0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.476130 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153"} err="failed to get container status \"0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153\": rpc error: code = NotFound desc = could not find container \"0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153\": container with ID starting with 0fe1135ace94bb8598cd1fc18bdf014a4e3598abd8d33094ad49974722df3153 not found: ID does not exist" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.476156 4834 scope.go:117] "RemoveContainer" containerID="670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d" Jan 30 21:45:48 crc kubenswrapper[4834]: E0130 21:45:48.476525 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d\": container with ID starting with 
670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d not found: ID does not exist" containerID="670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.476568 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d"} err="failed to get container status \"670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d\": rpc error: code = NotFound desc = could not find container \"670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d\": container with ID starting with 670ec3e57086c3a28ed2947d094e0950577149687f151f1631abb0b6ad5eff7d not found: ID does not exist" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.476600 4834 scope.go:117] "RemoveContainer" containerID="a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b" Jan 30 21:45:48 crc kubenswrapper[4834]: E0130 21:45:48.476929 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b\": container with ID starting with a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b not found: ID does not exist" containerID="a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.476956 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b"} err="failed to get container status \"a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b\": rpc error: code = NotFound desc = could not find container \"a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b\": container with ID starting with a8fe40ad23a856feed0826fdcadcd359f20101ee8b05638b7e56096e4233dd9b not found: ID does not 
exist" Jan 30 21:45:48 crc kubenswrapper[4834]: I0130 21:45:48.531565 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:45:48 crc kubenswrapper[4834]: E0130 21:45:48.531984 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:45:49 crc kubenswrapper[4834]: I0130 21:45:49.565252 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" path="/var/lib/kubelet/pods/92978c87-3d4e-448c-960d-cc37d8f37c93/volumes" Jan 30 21:45:49 crc kubenswrapper[4834]: I0130 21:45:49.566894 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dad33907-43bc-4799-987f-24fe76451564" path="/var/lib/kubelet/pods/dad33907-43bc-4799-987f-24fe76451564/volumes" Jan 30 21:45:55 crc kubenswrapper[4834]: I0130 21:45:55.356022 4834 generic.go:334] "Generic (PLEG): container finished" podID="8d7c7096-9310-450a-8562-4aa5ee7c3b4d" containerID="24930cc6d348282cfe4e59af4f2ad979d0a053450deb4d8e895d7b760d7227c6" exitCode=0 Jan 30 21:45:55 crc kubenswrapper[4834]: I0130 21:45:55.356151 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" event={"ID":"8d7c7096-9310-450a-8562-4aa5ee7c3b4d","Type":"ContainerDied","Data":"24930cc6d348282cfe4e59af4f2ad979d0a053450deb4d8e895d7b760d7227c6"} Jan 30 21:45:56 crc kubenswrapper[4834]: I0130 21:45:56.867161 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:45:56 crc kubenswrapper[4834]: I0130 21:45:56.956439 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-ssh-key-openstack-edpm-ipam\") pod \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " Jan 30 21:45:56 crc kubenswrapper[4834]: I0130 21:45:56.956520 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c4cb\" (UniqueName: \"kubernetes.io/projected/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-kube-api-access-5c4cb\") pod \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " Jan 30 21:45:56 crc kubenswrapper[4834]: I0130 21:45:56.956747 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-inventory\") pod \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " Jan 30 21:45:56 crc kubenswrapper[4834]: I0130 21:45:56.956793 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-bootstrap-combined-ca-bundle\") pod \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\" (UID: \"8d7c7096-9310-450a-8562-4aa5ee7c3b4d\") " Jan 30 21:45:56 crc kubenswrapper[4834]: I0130 21:45:56.962406 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8d7c7096-9310-450a-8562-4aa5ee7c3b4d" (UID: "8d7c7096-9310-450a-8562-4aa5ee7c3b4d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:56 crc kubenswrapper[4834]: I0130 21:45:56.963666 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-kube-api-access-5c4cb" (OuterVolumeSpecName: "kube-api-access-5c4cb") pod "8d7c7096-9310-450a-8562-4aa5ee7c3b4d" (UID: "8d7c7096-9310-450a-8562-4aa5ee7c3b4d"). InnerVolumeSpecName "kube-api-access-5c4cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:45:56 crc kubenswrapper[4834]: I0130 21:45:56.994125 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-inventory" (OuterVolumeSpecName: "inventory") pod "8d7c7096-9310-450a-8562-4aa5ee7c3b4d" (UID: "8d7c7096-9310-450a-8562-4aa5ee7c3b4d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.009219 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8d7c7096-9310-450a-8562-4aa5ee7c3b4d" (UID: "8d7c7096-9310-450a-8562-4aa5ee7c3b4d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.060086 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.060125 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c4cb\" (UniqueName: \"kubernetes.io/projected/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-kube-api-access-5c4cb\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.060141 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.060153 4834 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d7c7096-9310-450a-8562-4aa5ee7c3b4d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.391153 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" event={"ID":"8d7c7096-9310-450a-8562-4aa5ee7c3b4d","Type":"ContainerDied","Data":"2e3e17f2602922379f9bc3c1118394abe46a3d69060932ba4db71e55478555fe"} Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.391198 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e3e17f2602922379f9bc3c1118394abe46a3d69060932ba4db71e55478555fe" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.391206 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.491615 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj"] Jan 30 21:45:57 crc kubenswrapper[4834]: E0130 21:45:57.492585 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerName="registry-server" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.492604 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerName="registry-server" Jan 30 21:45:57 crc kubenswrapper[4834]: E0130 21:45:57.492622 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad33907-43bc-4799-987f-24fe76451564" containerName="registry-server" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.492648 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad33907-43bc-4799-987f-24fe76451564" containerName="registry-server" Jan 30 21:45:57 crc kubenswrapper[4834]: E0130 21:45:57.492663 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerName="extract-content" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.492671 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerName="extract-content" Jan 30 21:45:57 crc kubenswrapper[4834]: E0130 21:45:57.492679 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad33907-43bc-4799-987f-24fe76451564" containerName="extract-utilities" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.493048 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad33907-43bc-4799-987f-24fe76451564" containerName="extract-utilities" Jan 30 21:45:57 crc kubenswrapper[4834]: E0130 21:45:57.493071 4834 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8d7c7096-9310-450a-8562-4aa5ee7c3b4d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.493082 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d7c7096-9310-450a-8562-4aa5ee7c3b4d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:57 crc kubenswrapper[4834]: E0130 21:45:57.493097 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerName="extract-utilities" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.493107 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerName="extract-utilities" Jan 30 21:45:57 crc kubenswrapper[4834]: E0130 21:45:57.493118 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad33907-43bc-4799-987f-24fe76451564" containerName="extract-content" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.493126 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad33907-43bc-4799-987f-24fe76451564" containerName="extract-content" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.493455 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="92978c87-3d4e-448c-960d-cc37d8f37c93" containerName="registry-server" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.493475 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad33907-43bc-4799-987f-24fe76451564" containerName="registry-server" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.493489 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d7c7096-9310-450a-8562-4aa5ee7c3b4d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.494535 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.497483 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.497761 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.499528 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.499687 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.515613 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj"] Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.569094 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplw6\" (UniqueName: \"kubernetes.io/projected/31eef1e3-8bc9-4b01-953b-ff70f5082420-kube-api-access-nplw6\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.569204 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc 
kubenswrapper[4834]: I0130 21:45:57.569274 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.670165 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nplw6\" (UniqueName: \"kubernetes.io/projected/31eef1e3-8bc9-4b01-953b-ff70f5082420-kube-api-access-nplw6\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.670237 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.670286 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.675270 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.675801 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.691670 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplw6\" (UniqueName: \"kubernetes.io/projected/31eef1e3-8bc9-4b01-953b-ff70f5082420-kube-api-access-nplw6\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:57 crc kubenswrapper[4834]: I0130 21:45:57.834890 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:45:58 crc kubenswrapper[4834]: I0130 21:45:58.111266 4834 scope.go:117] "RemoveContainer" containerID="c97e8cdfa01d0bb281b88ac1712646d6ee18d6bc794750bcf01e43f25d0e848a" Jan 30 21:45:58 crc kubenswrapper[4834]: I0130 21:45:58.267853 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj"] Jan 30 21:45:58 crc kubenswrapper[4834]: I0130 21:45:58.402685 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" event={"ID":"31eef1e3-8bc9-4b01-953b-ff70f5082420","Type":"ContainerStarted","Data":"0c45a67120c5cc8a92b4df2da4d858f4cb5f9bc3258eb0a294d2e5bbc558be4a"} Jan 30 21:46:00 crc kubenswrapper[4834]: I0130 21:46:00.435193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" event={"ID":"31eef1e3-8bc9-4b01-953b-ff70f5082420","Type":"ContainerStarted","Data":"553be0d1508af77f6656960a5fd39388267744a6be89a5797f7cd9e44932c70b"} Jan 30 21:46:00 crc kubenswrapper[4834]: I0130 21:46:00.457221 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" podStartSLOduration=2.069401382 podStartE2EDuration="3.457197244s" podCreationTimestamp="2026-01-30 21:45:57 +0000 UTC" firstStartedPulling="2026-01-30 21:45:58.270934323 +0000 UTC m=+1809.424080461" lastFinishedPulling="2026-01-30 21:45:59.658730185 +0000 UTC m=+1810.811876323" observedRunningTime="2026-01-30 21:46:00.453903582 +0000 UTC m=+1811.607049730" watchObservedRunningTime="2026-01-30 21:46:00.457197244 +0000 UTC m=+1811.610343382" Jan 30 21:46:04 crc kubenswrapper[4834]: I0130 21:46:04.531200 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 
21:46:04 crc kubenswrapper[4834]: E0130 21:46:04.532210 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:46:18 crc kubenswrapper[4834]: I0130 21:46:18.531898 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:46:18 crc kubenswrapper[4834]: E0130 21:46:18.533069 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:46:32 crc kubenswrapper[4834]: I0130 21:46:32.531556 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:46:32 crc kubenswrapper[4834]: E0130 21:46:32.532923 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:46:36 crc kubenswrapper[4834]: I0130 21:46:36.048880 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-tq9jt"] Jan 30 
21:46:36 crc kubenswrapper[4834]: I0130 21:46:36.062193 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xgmcw"] Jan 30 21:46:36 crc kubenswrapper[4834]: I0130 21:46:36.074523 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-tq9jt"] Jan 30 21:46:36 crc kubenswrapper[4834]: I0130 21:46:36.085191 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xgmcw"] Jan 30 21:46:37 crc kubenswrapper[4834]: I0130 21:46:37.041223 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2b27-account-create-update-xn4jd"] Jan 30 21:46:37 crc kubenswrapper[4834]: I0130 21:46:37.052523 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5c0a-account-create-update-qgtjv"] Jan 30 21:46:37 crc kubenswrapper[4834]: I0130 21:46:37.064977 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2b27-account-create-update-xn4jd"] Jan 30 21:46:37 crc kubenswrapper[4834]: I0130 21:46:37.075226 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5c0a-account-create-update-qgtjv"] Jan 30 21:46:37 crc kubenswrapper[4834]: I0130 21:46:37.574335 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c5a9d6-cb42-4851-b02d-9e8664c51bf4" path="/var/lib/kubelet/pods/32c5a9d6-cb42-4851-b02d-9e8664c51bf4/volumes" Jan 30 21:46:37 crc kubenswrapper[4834]: I0130 21:46:37.575027 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b1b391-8f59-474c-9842-adcfedf5cee3" path="/var/lib/kubelet/pods/39b1b391-8f59-474c-9842-adcfedf5cee3/volumes" Jan 30 21:46:37 crc kubenswrapper[4834]: I0130 21:46:37.575784 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b92e53-1808-4b80-9b1d-413bbb97933d" path="/var/lib/kubelet/pods/a6b92e53-1808-4b80-9b1d-413bbb97933d/volumes" Jan 30 21:46:37 crc kubenswrapper[4834]: I0130 21:46:37.576403 
4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d" path="/var/lib/kubelet/pods/d60c70e9-9d5e-42f4-ae4f-2dfe0549ec5d/volumes" Jan 30 21:46:41 crc kubenswrapper[4834]: I0130 21:46:41.047993 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-747b-account-create-update-g5s5w"] Jan 30 21:46:41 crc kubenswrapper[4834]: I0130 21:46:41.060746 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-6g4tw"] Jan 30 21:46:41 crc kubenswrapper[4834]: I0130 21:46:41.074309 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-747b-account-create-update-g5s5w"] Jan 30 21:46:41 crc kubenswrapper[4834]: I0130 21:46:41.083908 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-6g4tw"] Jan 30 21:46:41 crc kubenswrapper[4834]: I0130 21:46:41.545722 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e0d2ef6-0a27-40ca-89d8-6508620ef387" path="/var/lib/kubelet/pods/1e0d2ef6-0a27-40ca-89d8-6508620ef387/volumes" Jan 30 21:46:41 crc kubenswrapper[4834]: I0130 21:46:41.546554 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c984ce30-d436-4965-83dd-3d07ec99edde" path="/var/lib/kubelet/pods/c984ce30-d436-4965-83dd-3d07ec99edde/volumes" Jan 30 21:46:46 crc kubenswrapper[4834]: I0130 21:46:46.531054 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:46:46 crc kubenswrapper[4834]: E0130 21:46:46.532002 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" 
podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:46:58 crc kubenswrapper[4834]: I0130 21:46:58.241116 4834 scope.go:117] "RemoveContainer" containerID="2ebefbca99ac1cd2493ef0753e3f52ad8e469a4ce01115609906e4ceec0d4f2e" Jan 30 21:46:58 crc kubenswrapper[4834]: I0130 21:46:58.277051 4834 scope.go:117] "RemoveContainer" containerID="dd9046c1575a0c00f39dc89856f4f2885d66221155294a2de23d47b858b8d27a" Jan 30 21:46:58 crc kubenswrapper[4834]: I0130 21:46:58.311013 4834 scope.go:117] "RemoveContainer" containerID="82263ce65f875c23e68e26efe3d10bb03d87f2c848a17824194d30cca02217e2" Jan 30 21:46:58 crc kubenswrapper[4834]: I0130 21:46:58.373839 4834 scope.go:117] "RemoveContainer" containerID="c7c583795bb10848d52ffc70341935de3c4d68870b1e18ad5998254ca101f43d" Jan 30 21:46:58 crc kubenswrapper[4834]: I0130 21:46:58.406126 4834 scope.go:117] "RemoveContainer" containerID="b980abe12aafe5d38e5f8768127d2ff0d6968da531333e1578c7114dfa76b10d" Jan 30 21:46:58 crc kubenswrapper[4834]: I0130 21:46:58.488800 4834 scope.go:117] "RemoveContainer" containerID="f475e99c2fbcc941b45c3a49f489cccc1546c7a1922802c751012d73f2226352" Jan 30 21:46:58 crc kubenswrapper[4834]: I0130 21:46:58.531706 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:46:58 crc kubenswrapper[4834]: E0130 21:46:58.532079 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.057271 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rhwh6"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 
21:47:03.076032 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dfde-account-create-update-m4mzz"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.090062 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rrsl9"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.101558 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2700-account-create-update-kqx8q"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.110628 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5d0f-account-create-update-wjvkc"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.118809 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dfde-account-create-update-m4mzz"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.128084 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6vxnz"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.137517 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5xww2"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.145737 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-rhwh6"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.154616 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rrsl9"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.163261 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5xww2"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.171673 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2700-account-create-update-kqx8q"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.181106 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6vxnz"] Jan 30 21:47:03 crc 
kubenswrapper[4834]: I0130 21:47:03.189847 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5d0f-account-create-update-wjvkc"] Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.547313 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06bf42dd-19ce-4bef-af7c-4777a04fbdca" path="/var/lib/kubelet/pods/06bf42dd-19ce-4bef-af7c-4777a04fbdca/volumes" Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.548763 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12040694-1a05-486f-a71d-eb6940d21985" path="/var/lib/kubelet/pods/12040694-1a05-486f-a71d-eb6940d21985/volumes" Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.549593 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37293f9c-d144-4553-ab7c-04caf3e28d18" path="/var/lib/kubelet/pods/37293f9c-d144-4553-ab7c-04caf3e28d18/volumes" Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.550368 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4" path="/var/lib/kubelet/pods/4f3ada87-7f34-4bba-bddb-4ca05ccfd6f4/volumes" Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.551914 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63bd7fa6-422c-44c1-8d65-569d82a716f1" path="/var/lib/kubelet/pods/63bd7fa6-422c-44c1-8d65-569d82a716f1/volumes" Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.552681 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7" path="/var/lib/kubelet/pods/83a9e5ec-4af7-4d23-8e22-bc9187a3f4a7/volumes" Jan 30 21:47:03 crc kubenswrapper[4834]: I0130 21:47:03.553452 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b295ca4-d536-4367-92e2-746945500e9a" path="/var/lib/kubelet/pods/8b295ca4-d536-4367-92e2-746945500e9a/volumes" Jan 30 21:47:06 crc kubenswrapper[4834]: I0130 21:47:06.036443 4834 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/glance-db-sync-mdpvh"] Jan 30 21:47:06 crc kubenswrapper[4834]: I0130 21:47:06.045022 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mdpvh"] Jan 30 21:47:07 crc kubenswrapper[4834]: I0130 21:47:07.041111 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-zct6g"] Jan 30 21:47:07 crc kubenswrapper[4834]: I0130 21:47:07.058698 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-zct6g"] Jan 30 21:47:07 crc kubenswrapper[4834]: I0130 21:47:07.562492 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a073f1-b8e0-4365-ae24-c4e2351a7f0e" path="/var/lib/kubelet/pods/95a073f1-b8e0-4365-ae24-c4e2351a7f0e/volumes" Jan 30 21:47:07 crc kubenswrapper[4834]: I0130 21:47:07.563658 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88d2fad-923e-4f77-b8f2-d8be1858e1ff" path="/var/lib/kubelet/pods/b88d2fad-923e-4f77-b8f2-d8be1858e1ff/volumes" Jan 30 21:47:09 crc kubenswrapper[4834]: I0130 21:47:09.547065 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:47:10 crc kubenswrapper[4834]: I0130 21:47:10.155283 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"3ecbc7678f260ea9142896ba0b31a8f28c33db200a7138d092326369289802d4"} Jan 30 21:47:46 crc kubenswrapper[4834]: I0130 21:47:46.084743 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8m5jq"] Jan 30 21:47:46 crc kubenswrapper[4834]: I0130 21:47:46.098558 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8m5jq"] Jan 30 21:47:47 crc kubenswrapper[4834]: I0130 21:47:47.546734 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="11884add-09c1-45c0-92fc-9da5ebd51af3" path="/var/lib/kubelet/pods/11884add-09c1-45c0-92fc-9da5ebd51af3/volumes" Jan 30 21:47:56 crc kubenswrapper[4834]: I0130 21:47:56.049918 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-5v96n"] Jan 30 21:47:56 crc kubenswrapper[4834]: I0130 21:47:56.062194 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-5v96n"] Jan 30 21:47:57 crc kubenswrapper[4834]: I0130 21:47:57.552476 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814e680c-6380-4ab3-a481-2f9afe8b88ff" path="/var/lib/kubelet/pods/814e680c-6380-4ab3-a481-2f9afe8b88ff/volumes" Jan 30 21:47:58 crc kubenswrapper[4834]: I0130 21:47:58.678746 4834 scope.go:117] "RemoveContainer" containerID="77608a94523b85eae42a84752cfa4a13d422e2d663f81e65e453670d6c8d72b5" Jan 30 21:47:58 crc kubenswrapper[4834]: I0130 21:47:58.724042 4834 scope.go:117] "RemoveContainer" containerID="b5e18427c6c4628454ea795bb4f87a226674af23856a6b2f67e8979bafdc21d9" Jan 30 21:47:58 crc kubenswrapper[4834]: I0130 21:47:58.781841 4834 scope.go:117] "RemoveContainer" containerID="359014c5a5c906e4181ee98cf98574f0f2a28dbdd4a849cf7f8f605dc9d542df" Jan 30 21:47:58 crc kubenswrapper[4834]: I0130 21:47:58.842043 4834 scope.go:117] "RemoveContainer" containerID="b296177a63a222d735441c2738ca905fc4aa4146d067cda310e73d329e189780" Jan 30 21:47:58 crc kubenswrapper[4834]: I0130 21:47:58.878978 4834 scope.go:117] "RemoveContainer" containerID="7aaff3e86e401febc7b069870846ac8dfd8d8de37d3d393ac04f337241eb64fc" Jan 30 21:47:58 crc kubenswrapper[4834]: I0130 21:47:58.918100 4834 scope.go:117] "RemoveContainer" containerID="7a253c696c1b6a1eab036d4bb5186d4d347b74150d2d271133bcc8cd9ad5c3fd" Jan 30 21:47:58 crc kubenswrapper[4834]: I0130 21:47:58.974847 4834 scope.go:117] "RemoveContainer" containerID="bdfeb13905025438bfd0e6ac00ee5aadc6e9d440ae4f7227c957f5c0741fcb56" Jan 30 21:47:58 crc kubenswrapper[4834]: I0130 
21:47:58.995912 4834 scope.go:117] "RemoveContainer" containerID="1ec86ae8e8fc32f2fcb9d9b7ed35ee2b2dce03287356b82f42e482fec1b3914c" Jan 30 21:47:59 crc kubenswrapper[4834]: I0130 21:47:59.015170 4834 scope.go:117] "RemoveContainer" containerID="54d881f2ae455fd50bee56d740d60633535ae77375bdadbfcff9928bbf3b2e7e" Jan 30 21:47:59 crc kubenswrapper[4834]: I0130 21:47:59.042779 4834 scope.go:117] "RemoveContainer" containerID="1fe311a630beda5bd922a6a19cb7669d9ce74d490be2e624cc44505c6a89359f" Jan 30 21:47:59 crc kubenswrapper[4834]: I0130 21:47:59.072848 4834 scope.go:117] "RemoveContainer" containerID="751bcc3a49f6dccdd019572068545ab1cf7d6e190f1668a8133345d583109042" Jan 30 21:48:04 crc kubenswrapper[4834]: I0130 21:48:04.036701 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ftcrs"] Jan 30 21:48:04 crc kubenswrapper[4834]: I0130 21:48:04.051026 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ftcrs"] Jan 30 21:48:05 crc kubenswrapper[4834]: I0130 21:48:05.041976 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-gvxl7"] Jan 30 21:48:05 crc kubenswrapper[4834]: I0130 21:48:05.049404 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2ngf2"] Jan 30 21:48:05 crc kubenswrapper[4834]: I0130 21:48:05.057861 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-gvxl7"] Jan 30 21:48:05 crc kubenswrapper[4834]: I0130 21:48:05.065885 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2ngf2"] Jan 30 21:48:05 crc kubenswrapper[4834]: I0130 21:48:05.543775 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c62e65-34f5-4f5c-83fd-9af56b711bac" path="/var/lib/kubelet/pods/25c62e65-34f5-4f5c-83fd-9af56b711bac/volumes" Jan 30 21:48:05 crc kubenswrapper[4834]: I0130 21:48:05.544597 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b49f42b3-c35a-4138-89e6-95f7abfa23bb" path="/var/lib/kubelet/pods/b49f42b3-c35a-4138-89e6-95f7abfa23bb/volumes" Jan 30 21:48:05 crc kubenswrapper[4834]: I0130 21:48:05.545174 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb10ef46-028a-4e54-a587-4843dae377f9" path="/var/lib/kubelet/pods/bb10ef46-028a-4e54-a587-4843dae377f9/volumes" Jan 30 21:48:42 crc kubenswrapper[4834]: I0130 21:48:42.144384 4834 generic.go:334] "Generic (PLEG): container finished" podID="31eef1e3-8bc9-4b01-953b-ff70f5082420" containerID="553be0d1508af77f6656960a5fd39388267744a6be89a5797f7cd9e44932c70b" exitCode=0 Jan 30 21:48:42 crc kubenswrapper[4834]: I0130 21:48:42.144473 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" event={"ID":"31eef1e3-8bc9-4b01-953b-ff70f5082420","Type":"ContainerDied","Data":"553be0d1508af77f6656960a5fd39388267744a6be89a5797f7cd9e44932c70b"} Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.632161 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.704461 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nplw6\" (UniqueName: \"kubernetes.io/projected/31eef1e3-8bc9-4b01-953b-ff70f5082420-kube-api-access-nplw6\") pod \"31eef1e3-8bc9-4b01-953b-ff70f5082420\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.704804 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-ssh-key-openstack-edpm-ipam\") pod \"31eef1e3-8bc9-4b01-953b-ff70f5082420\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.704981 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-inventory\") pod \"31eef1e3-8bc9-4b01-953b-ff70f5082420\" (UID: \"31eef1e3-8bc9-4b01-953b-ff70f5082420\") " Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.709892 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31eef1e3-8bc9-4b01-953b-ff70f5082420-kube-api-access-nplw6" (OuterVolumeSpecName: "kube-api-access-nplw6") pod "31eef1e3-8bc9-4b01-953b-ff70f5082420" (UID: "31eef1e3-8bc9-4b01-953b-ff70f5082420"). InnerVolumeSpecName "kube-api-access-nplw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.737483 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-inventory" (OuterVolumeSpecName: "inventory") pod "31eef1e3-8bc9-4b01-953b-ff70f5082420" (UID: "31eef1e3-8bc9-4b01-953b-ff70f5082420"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.737856 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31eef1e3-8bc9-4b01-953b-ff70f5082420" (UID: "31eef1e3-8bc9-4b01-953b-ff70f5082420"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.806962 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.807133 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nplw6\" (UniqueName: \"kubernetes.io/projected/31eef1e3-8bc9-4b01-953b-ff70f5082420-kube-api-access-nplw6\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:43 crc kubenswrapper[4834]: I0130 21:48:43.807216 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31eef1e3-8bc9-4b01-953b-ff70f5082420-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.168629 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" event={"ID":"31eef1e3-8bc9-4b01-953b-ff70f5082420","Type":"ContainerDied","Data":"0c45a67120c5cc8a92b4df2da4d858f4cb5f9bc3258eb0a294d2e5bbc558be4a"} Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.168735 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c45a67120c5cc8a92b4df2da4d858f4cb5f9bc3258eb0a294d2e5bbc558be4a" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 
21:48:44.168767 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.281496 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt"] Jan 30 21:48:44 crc kubenswrapper[4834]: E0130 21:48:44.281908 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eef1e3-8bc9-4b01-953b-ff70f5082420" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.281925 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eef1e3-8bc9-4b01-953b-ff70f5082420" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.282114 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="31eef1e3-8bc9-4b01-953b-ff70f5082420" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.282819 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.284404 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.284965 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.285412 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.285636 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.293734 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt"] Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.422658 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.422710 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp5f8\" (UniqueName: \"kubernetes.io/projected/2ea3b15b-f04b-4690-bb26-e2ec96781265-kube-api-access-qp5f8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 
21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.422822 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.525288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.525533 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.525588 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp5f8\" (UniqueName: \"kubernetes.io/projected/2ea3b15b-f04b-4690-bb26-e2ec96781265-kube-api-access-qp5f8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.535249 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.539260 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.553656 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp5f8\" (UniqueName: \"kubernetes.io/projected/2ea3b15b-f04b-4690-bb26-e2ec96781265-kube-api-access-qp5f8\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-px4dt\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:44 crc kubenswrapper[4834]: I0130 21:48:44.601480 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:48:45 crc kubenswrapper[4834]: I0130 21:48:45.052923 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-v4rg6"] Jan 30 21:48:45 crc kubenswrapper[4834]: I0130 21:48:45.065321 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-v4rg6"] Jan 30 21:48:45 crc kubenswrapper[4834]: I0130 21:48:45.188525 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt"] Jan 30 21:48:45 crc kubenswrapper[4834]: I0130 21:48:45.545835 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c39bab-7d70-4cf5-88cb-c7c61d3199a2" path="/var/lib/kubelet/pods/70c39bab-7d70-4cf5-88cb-c7c61d3199a2/volumes" Jan 30 21:48:46 crc kubenswrapper[4834]: I0130 21:48:46.046215 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-35d8-account-create-update-7rznt"] Jan 30 21:48:46 crc kubenswrapper[4834]: I0130 21:48:46.059120 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-35d8-account-create-update-7rznt"] Jan 30 21:48:46 crc kubenswrapper[4834]: I0130 21:48:46.193304 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" event={"ID":"2ea3b15b-f04b-4690-bb26-e2ec96781265","Type":"ContainerStarted","Data":"10fccaf2d00c0657adade935e485514ed388afafc3d99b81ded55f92924f4653"} Jan 30 21:48:46 crc kubenswrapper[4834]: I0130 21:48:46.193491 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" event={"ID":"2ea3b15b-f04b-4690-bb26-e2ec96781265","Type":"ContainerStarted","Data":"4e50b1f160d69a963dee8b07873c3d983fc6fc9cd24ebbff13f208af70644df4"} Jan 30 21:48:46 crc kubenswrapper[4834]: I0130 21:48:46.229987 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" podStartSLOduration=1.782715328 podStartE2EDuration="2.229963476s" podCreationTimestamp="2026-01-30 21:48:44 +0000 UTC" firstStartedPulling="2026-01-30 21:48:45.190252791 +0000 UTC m=+1976.343398949" lastFinishedPulling="2026-01-30 21:48:45.637500959 +0000 UTC m=+1976.790647097" observedRunningTime="2026-01-30 21:48:46.212590507 +0000 UTC m=+1977.365736675" watchObservedRunningTime="2026-01-30 21:48:46.229963476 +0000 UTC m=+1977.383109624" Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.044873 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-krbbm"] Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.056758 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-2dkk4"] Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.064700 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-98df-account-create-update-cdwj4"] Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.078592 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-2dkk4"] Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.086554 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-krbbm"] Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.099676 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2910-account-create-update-wz247"] Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.107429 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-98df-account-create-update-cdwj4"] Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.114854 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2910-account-create-update-wz247"] Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 
21:48:47.562354 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451d67a3-3dc8-4b25-9e88-b5e6e16fdb27" path="/var/lib/kubelet/pods/451d67a3-3dc8-4b25-9e88-b5e6e16fdb27/volumes" Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.563464 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51bc6106-f66d-4945-8ec9-91de63f2d579" path="/var/lib/kubelet/pods/51bc6106-f66d-4945-8ec9-91de63f2d579/volumes" Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.565602 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="610bc5a1-2033-4f01-8ee9-e02c596fc94f" path="/var/lib/kubelet/pods/610bc5a1-2033-4f01-8ee9-e02c596fc94f/volumes" Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.570538 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8547e76b-acd4-45ea-a2ba-308e26be62b5" path="/var/lib/kubelet/pods/8547e76b-acd4-45ea-a2ba-308e26be62b5/volumes" Jan 30 21:48:47 crc kubenswrapper[4834]: I0130 21:48:47.571693 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf239124-75b5-4aa0-ade1-af07156f6b14" path="/var/lib/kubelet/pods/bf239124-75b5-4aa0-ade1-af07156f6b14/volumes" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.316533 4834 scope.go:117] "RemoveContainer" containerID="e3c7c0a01900b2a8ac0ea0a429298d2283cec4e9adc0bca59be91e837502e5b7" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.367059 4834 scope.go:117] "RemoveContainer" containerID="f283daa4b2072b17b52843506aefba9dd90c8c44b34da2157b5091272362becd" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.431312 4834 scope.go:117] "RemoveContainer" containerID="13c38d96cb9a4a020b8857f220884dd914750856d66040a7e82963dbdde3f53f" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.472602 4834 scope.go:117] "RemoveContainer" containerID="3e7ec2eafede4e5b75a15bc96c461c617bfe6bc2a6ddad530a1360ad09a8f347" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.551625 4834 
scope.go:117] "RemoveContainer" containerID="eacb214f7980e4aed57c5a903cfebb3c7d4b92d18752d9d10186e4d6df520eef" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.583477 4834 scope.go:117] "RemoveContainer" containerID="7e6444a0b6abb0edabfa3eb062969ad4892f3229737d07dd920eafb2b5254276" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.628473 4834 scope.go:117] "RemoveContainer" containerID="b539f7e809f078f0f410bcca6733678fc479b9adca829469beb7734a5beb229f" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.651685 4834 scope.go:117] "RemoveContainer" containerID="2d9cf2b23744cdea3e0b754b567ce60376c5e9a12a16a5b72d9b51e6a2a6bac2" Jan 30 21:48:59 crc kubenswrapper[4834]: I0130 21:48:59.670502 4834 scope.go:117] "RemoveContainer" containerID="34f089620cdd871efeed60f345b16afe7296eb37e3f22d711c4d68c2b1f70783" Jan 30 21:49:23 crc kubenswrapper[4834]: I0130 21:49:23.035995 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7sj55"] Jan 30 21:49:23 crc kubenswrapper[4834]: I0130 21:49:23.046019 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7sj55"] Jan 30 21:49:23 crc kubenswrapper[4834]: I0130 21:49:23.544192 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="665b0f39-b1d4-431f-8741-90c0b6d31d52" path="/var/lib/kubelet/pods/665b0f39-b1d4-431f-8741-90c0b6d31d52/volumes" Jan 30 21:49:34 crc kubenswrapper[4834]: I0130 21:49:34.161635 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:49:34 crc kubenswrapper[4834]: I0130 21:49:34.162267 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" 
podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:49:51 crc kubenswrapper[4834]: I0130 21:49:51.045647 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vq77c"] Jan 30 21:49:51 crc kubenswrapper[4834]: I0130 21:49:51.054291 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vq77c"] Jan 30 21:49:51 crc kubenswrapper[4834]: I0130 21:49:51.550531 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9628df07-52b2-4e8b-9b6b-a1964c59ea27" path="/var/lib/kubelet/pods/9628df07-52b2-4e8b-9b6b-a1964c59ea27/volumes" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.560623 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qnw6l"] Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.563716 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.572570 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qnw6l"] Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.684932 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-catalog-content\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.685039 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-utilities\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.685456 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vksc\" (UniqueName: \"kubernetes.io/projected/2a4db272-1623-4bf3-ab21-30f44941bd55-kube-api-access-4vksc\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.787725 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-catalog-content\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.787811 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-utilities\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.787955 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vksc\" (UniqueName: \"kubernetes.io/projected/2a4db272-1623-4bf3-ab21-30f44941bd55-kube-api-access-4vksc\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.788282 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-catalog-content\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.788646 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-utilities\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.812659 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vksc\" (UniqueName: \"kubernetes.io/projected/2a4db272-1623-4bf3-ab21-30f44941bd55-kube-api-access-4vksc\") pod \"certified-operators-qnw6l\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:55 crc kubenswrapper[4834]: I0130 21:49:55.889354 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:49:56 crc kubenswrapper[4834]: I0130 21:49:56.398071 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qnw6l"] Jan 30 21:49:56 crc kubenswrapper[4834]: I0130 21:49:56.915882 4834 generic.go:334] "Generic (PLEG): container finished" podID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerID="76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144" exitCode=0 Jan 30 21:49:56 crc kubenswrapper[4834]: I0130 21:49:56.915924 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qnw6l" event={"ID":"2a4db272-1623-4bf3-ab21-30f44941bd55","Type":"ContainerDied","Data":"76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144"} Jan 30 21:49:56 crc kubenswrapper[4834]: I0130 21:49:56.916215 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qnw6l" event={"ID":"2a4db272-1623-4bf3-ab21-30f44941bd55","Type":"ContainerStarted","Data":"bd70cc95b9d50063a725f7964ff424df7b936ce5e169ed15fac480b84467ffe6"} Jan 30 21:49:58 crc kubenswrapper[4834]: I0130 21:49:58.943201 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qnw6l" event={"ID":"2a4db272-1623-4bf3-ab21-30f44941bd55","Type":"ContainerStarted","Data":"90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1"} Jan 30 21:49:59 crc kubenswrapper[4834]: I0130 21:49:59.874341 4834 scope.go:117] "RemoveContainer" containerID="cdd3c241810cf3af99121cf9e569a6eeb1e998a40b2ce8c5e35d839e5362a59c" Jan 30 21:49:59 crc kubenswrapper[4834]: I0130 21:49:59.933488 4834 scope.go:117] "RemoveContainer" containerID="48a941c4cb471658f4609af2ce965991f76cb6cc26499a54ecacbe135e58b6ed" Jan 30 21:50:00 crc kubenswrapper[4834]: I0130 21:50:00.971711 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerID="90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1" exitCode=0 Jan 30 21:50:00 crc kubenswrapper[4834]: I0130 21:50:00.972049 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qnw6l" event={"ID":"2a4db272-1623-4bf3-ab21-30f44941bd55","Type":"ContainerDied","Data":"90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1"} Jan 30 21:50:02 crc kubenswrapper[4834]: I0130 21:50:02.992218 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qnw6l" event={"ID":"2a4db272-1623-4bf3-ab21-30f44941bd55","Type":"ContainerStarted","Data":"6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d"} Jan 30 21:50:03 crc kubenswrapper[4834]: I0130 21:50:03.019550 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qnw6l" podStartSLOduration=2.619251009 podStartE2EDuration="8.019529572s" podCreationTimestamp="2026-01-30 21:49:55 +0000 UTC" firstStartedPulling="2026-01-30 21:49:56.91805438 +0000 UTC m=+2048.071200518" lastFinishedPulling="2026-01-30 21:50:02.318332943 +0000 UTC m=+2053.471479081" observedRunningTime="2026-01-30 21:50:03.00918107 +0000 UTC m=+2054.162327218" watchObservedRunningTime="2026-01-30 21:50:03.019529572 +0000 UTC m=+2054.172675730" Jan 30 21:50:04 crc kubenswrapper[4834]: I0130 21:50:04.160888 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:50:04 crc kubenswrapper[4834]: I0130 21:50:04.160982 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:50:05 crc kubenswrapper[4834]: I0130 21:50:05.890966 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:50:05 crc kubenswrapper[4834]: I0130 21:50:05.891300 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:50:05 crc kubenswrapper[4834]: I0130 21:50:05.936891 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:50:15 crc kubenswrapper[4834]: I0130 21:50:15.940738 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.002036 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qnw6l"] Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.120185 4834 generic.go:334] "Generic (PLEG): container finished" podID="2ea3b15b-f04b-4690-bb26-e2ec96781265" containerID="10fccaf2d00c0657adade935e485514ed388afafc3d99b81ded55f92924f4653" exitCode=0 Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.120271 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" event={"ID":"2ea3b15b-f04b-4690-bb26-e2ec96781265","Type":"ContainerDied","Data":"10fccaf2d00c0657adade935e485514ed388afafc3d99b81ded55f92924f4653"} Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.120455 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qnw6l" podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerName="registry-server" 
containerID="cri-o://6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d" gracePeriod=2 Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.668250 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.806244 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-utilities\") pod \"2a4db272-1623-4bf3-ab21-30f44941bd55\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.806759 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-catalog-content\") pod \"2a4db272-1623-4bf3-ab21-30f44941bd55\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.806798 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vksc\" (UniqueName: \"kubernetes.io/projected/2a4db272-1623-4bf3-ab21-30f44941bd55-kube-api-access-4vksc\") pod \"2a4db272-1623-4bf3-ab21-30f44941bd55\" (UID: \"2a4db272-1623-4bf3-ab21-30f44941bd55\") " Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.807523 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-utilities" (OuterVolumeSpecName: "utilities") pod "2a4db272-1623-4bf3-ab21-30f44941bd55" (UID: "2a4db272-1623-4bf3-ab21-30f44941bd55"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.814757 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4db272-1623-4bf3-ab21-30f44941bd55-kube-api-access-4vksc" (OuterVolumeSpecName: "kube-api-access-4vksc") pod "2a4db272-1623-4bf3-ab21-30f44941bd55" (UID: "2a4db272-1623-4bf3-ab21-30f44941bd55"). InnerVolumeSpecName "kube-api-access-4vksc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.846182 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a4db272-1623-4bf3-ab21-30f44941bd55" (UID: "2a4db272-1623-4bf3-ab21-30f44941bd55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.908737 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.908771 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a4db272-1623-4bf3-ab21-30f44941bd55-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:16 crc kubenswrapper[4834]: I0130 21:50:16.908783 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vksc\" (UniqueName: \"kubernetes.io/projected/2a4db272-1623-4bf3-ab21-30f44941bd55-kube-api-access-4vksc\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.139899 4834 generic.go:334] "Generic (PLEG): container finished" podID="2a4db272-1623-4bf3-ab21-30f44941bd55" 
containerID="6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d" exitCode=0 Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.140232 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qnw6l" event={"ID":"2a4db272-1623-4bf3-ab21-30f44941bd55","Type":"ContainerDied","Data":"6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d"} Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.140269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qnw6l" event={"ID":"2a4db272-1623-4bf3-ab21-30f44941bd55","Type":"ContainerDied","Data":"bd70cc95b9d50063a725f7964ff424df7b936ce5e169ed15fac480b84467ffe6"} Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.140300 4834 scope.go:117] "RemoveContainer" containerID="6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.140992 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qnw6l" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.191695 4834 scope.go:117] "RemoveContainer" containerID="90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.199549 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qnw6l"] Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.215134 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qnw6l"] Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.240676 4834 scope.go:117] "RemoveContainer" containerID="76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.283333 4834 scope.go:117] "RemoveContainer" containerID="6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d" Jan 30 21:50:17 crc kubenswrapper[4834]: E0130 21:50:17.285075 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d\": container with ID starting with 6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d not found: ID does not exist" containerID="6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.285101 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d"} err="failed to get container status \"6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d\": rpc error: code = NotFound desc = could not find container \"6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d\": container with ID starting with 6a47d3514cf07680be17c2366ff7d6376462d35a38a19af2c6f451c5b5ca2c0d not 
found: ID does not exist" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.285122 4834 scope.go:117] "RemoveContainer" containerID="90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1" Jan 30 21:50:17 crc kubenswrapper[4834]: E0130 21:50:17.285471 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1\": container with ID starting with 90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1 not found: ID does not exist" containerID="90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.285497 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1"} err="failed to get container status \"90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1\": rpc error: code = NotFound desc = could not find container \"90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1\": container with ID starting with 90ce55d335d4338c55a04249827d6f7cb7741c69ac8a04825f19803decfcdeb1 not found: ID does not exist" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.285516 4834 scope.go:117] "RemoveContainer" containerID="76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144" Jan 30 21:50:17 crc kubenswrapper[4834]: E0130 21:50:17.286396 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144\": container with ID starting with 76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144 not found: ID does not exist" containerID="76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.286432 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144"} err="failed to get container status \"76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144\": rpc error: code = NotFound desc = could not find container \"76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144\": container with ID starting with 76c601f4afbd6fbd79f0bca7596453920393d8681d9d17a99df9730cdb9b5144 not found: ID does not exist" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.545596 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" path="/var/lib/kubelet/pods/2a4db272-1623-4bf3-ab21-30f44941bd55/volumes" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.653566 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.742384 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp5f8\" (UniqueName: \"kubernetes.io/projected/2ea3b15b-f04b-4690-bb26-e2ec96781265-kube-api-access-qp5f8\") pod \"2ea3b15b-f04b-4690-bb26-e2ec96781265\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.742577 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-ssh-key-openstack-edpm-ipam\") pod \"2ea3b15b-f04b-4690-bb26-e2ec96781265\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.742657 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-inventory\") pod 
\"2ea3b15b-f04b-4690-bb26-e2ec96781265\" (UID: \"2ea3b15b-f04b-4690-bb26-e2ec96781265\") " Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.746734 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea3b15b-f04b-4690-bb26-e2ec96781265-kube-api-access-qp5f8" (OuterVolumeSpecName: "kube-api-access-qp5f8") pod "2ea3b15b-f04b-4690-bb26-e2ec96781265" (UID: "2ea3b15b-f04b-4690-bb26-e2ec96781265"). InnerVolumeSpecName "kube-api-access-qp5f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.777275 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2ea3b15b-f04b-4690-bb26-e2ec96781265" (UID: "2ea3b15b-f04b-4690-bb26-e2ec96781265"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.779720 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-inventory" (OuterVolumeSpecName: "inventory") pod "2ea3b15b-f04b-4690-bb26-e2ec96781265" (UID: "2ea3b15b-f04b-4690-bb26-e2ec96781265"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.844504 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp5f8\" (UniqueName: \"kubernetes.io/projected/2ea3b15b-f04b-4690-bb26-e2ec96781265-kube-api-access-qp5f8\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.844542 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:17 crc kubenswrapper[4834]: I0130 21:50:17.844558 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ea3b15b-f04b-4690-bb26-e2ec96781265-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.151793 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.151793 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-px4dt" event={"ID":"2ea3b15b-f04b-4690-bb26-e2ec96781265","Type":"ContainerDied","Data":"4e50b1f160d69a963dee8b07873c3d983fc6fc9cd24ebbff13f208af70644df4"} Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.151940 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e50b1f160d69a963dee8b07873c3d983fc6fc9cd24ebbff13f208af70644df4" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.248679 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf"] Jan 30 21:50:18 crc kubenswrapper[4834]: E0130 21:50:18.249333 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerName="registry-server" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.249348 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerName="registry-server" Jan 30 21:50:18 crc kubenswrapper[4834]: E0130 21:50:18.249381 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea3b15b-f04b-4690-bb26-e2ec96781265" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.249390 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea3b15b-f04b-4690-bb26-e2ec96781265" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:18 crc kubenswrapper[4834]: E0130 21:50:18.249429 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerName="extract-utilities" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.249435 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerName="extract-utilities" Jan 30 21:50:18 crc kubenswrapper[4834]: E0130 21:50:18.249448 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerName="extract-content" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.249453 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerName="extract-content" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.249615 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea3b15b-f04b-4690-bb26-e2ec96781265" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.249628 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4db272-1623-4bf3-ab21-30f44941bd55" containerName="registry-server" Jan 30 
21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.250453 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.254243 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.254257 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.254676 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.259837 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.266269 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf"] Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.352543 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.352666 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.352774 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8r5m\" (UniqueName: \"kubernetes.io/projected/51e87d5c-a737-4250-b885-3d7ebbd2c803-kube-api-access-z8r5m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.454745 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.454856 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.454940 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8r5m\" (UniqueName: \"kubernetes.io/projected/51e87d5c-a737-4250-b885-3d7ebbd2c803-kube-api-access-z8r5m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.461093 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.462616 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.470743 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8r5m\" (UniqueName: \"kubernetes.io/projected/51e87d5c-a737-4250-b885-3d7ebbd2c803-kube-api-access-z8r5m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7qblf\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:18 crc kubenswrapper[4834]: I0130 21:50:18.566653 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:19 crc kubenswrapper[4834]: W0130 21:50:19.195563 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51e87d5c_a737_4250_b885_3d7ebbd2c803.slice/crio-e71ed4ae08936ad5de625b803d5ae7deb1aeef9e68f4a165c67a739321edf235 WatchSource:0}: Error finding container e71ed4ae08936ad5de625b803d5ae7deb1aeef9e68f4a165c67a739321edf235: Status 404 returned error can't find the container with id e71ed4ae08936ad5de625b803d5ae7deb1aeef9e68f4a165c67a739321edf235 Jan 30 21:50:19 crc kubenswrapper[4834]: I0130 21:50:19.197156 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf"] Jan 30 21:50:20 crc kubenswrapper[4834]: I0130 21:50:20.171870 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" event={"ID":"51e87d5c-a737-4250-b885-3d7ebbd2c803","Type":"ContainerStarted","Data":"e71ed4ae08936ad5de625b803d5ae7deb1aeef9e68f4a165c67a739321edf235"} Jan 30 21:50:21 crc kubenswrapper[4834]: I0130 21:50:21.187635 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" event={"ID":"51e87d5c-a737-4250-b885-3d7ebbd2c803","Type":"ContainerStarted","Data":"f32584fc3cb8f6f025fe4df60e3d764ca762abc50a3814f6454b787d08e56f8b"} Jan 30 21:50:21 crc kubenswrapper[4834]: I0130 21:50:21.218743 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" podStartSLOduration=2.144001568 podStartE2EDuration="3.218719894s" podCreationTimestamp="2026-01-30 21:50:18 +0000 UTC" firstStartedPulling="2026-01-30 21:50:19.198095449 +0000 UTC m=+2070.351241587" lastFinishedPulling="2026-01-30 21:50:20.272813775 +0000 UTC 
m=+2071.425959913" observedRunningTime="2026-01-30 21:50:21.21042141 +0000 UTC m=+2072.363567568" watchObservedRunningTime="2026-01-30 21:50:21.218719894 +0000 UTC m=+2072.371866072" Jan 30 21:50:25 crc kubenswrapper[4834]: I0130 21:50:25.041131 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkmbc"] Jan 30 21:50:25 crc kubenswrapper[4834]: I0130 21:50:25.049481 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkmbc"] Jan 30 21:50:25 crc kubenswrapper[4834]: I0130 21:50:25.220661 4834 generic.go:334] "Generic (PLEG): container finished" podID="51e87d5c-a737-4250-b885-3d7ebbd2c803" containerID="f32584fc3cb8f6f025fe4df60e3d764ca762abc50a3814f6454b787d08e56f8b" exitCode=0 Jan 30 21:50:25 crc kubenswrapper[4834]: I0130 21:50:25.220706 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" event={"ID":"51e87d5c-a737-4250-b885-3d7ebbd2c803","Type":"ContainerDied","Data":"f32584fc3cb8f6f025fe4df60e3d764ca762abc50a3814f6454b787d08e56f8b"} Jan 30 21:50:25 crc kubenswrapper[4834]: I0130 21:50:25.550202 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e513111-c687-4c45-8262-7ce559c7decf" path="/var/lib/kubelet/pods/8e513111-c687-4c45-8262-7ce559c7decf/volumes" Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.706004 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.855574 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8r5m\" (UniqueName: \"kubernetes.io/projected/51e87d5c-a737-4250-b885-3d7ebbd2c803-kube-api-access-z8r5m\") pod \"51e87d5c-a737-4250-b885-3d7ebbd2c803\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.856340 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-ssh-key-openstack-edpm-ipam\") pod \"51e87d5c-a737-4250-b885-3d7ebbd2c803\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.856945 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-inventory\") pod \"51e87d5c-a737-4250-b885-3d7ebbd2c803\" (UID: \"51e87d5c-a737-4250-b885-3d7ebbd2c803\") " Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.861885 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e87d5c-a737-4250-b885-3d7ebbd2c803-kube-api-access-z8r5m" (OuterVolumeSpecName: "kube-api-access-z8r5m") pod "51e87d5c-a737-4250-b885-3d7ebbd2c803" (UID: "51e87d5c-a737-4250-b885-3d7ebbd2c803"). InnerVolumeSpecName "kube-api-access-z8r5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.884509 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-inventory" (OuterVolumeSpecName: "inventory") pod "51e87d5c-a737-4250-b885-3d7ebbd2c803" (UID: "51e87d5c-a737-4250-b885-3d7ebbd2c803"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.895351 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "51e87d5c-a737-4250-b885-3d7ebbd2c803" (UID: "51e87d5c-a737-4250-b885-3d7ebbd2c803"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.961002 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.961050 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8r5m\" (UniqueName: \"kubernetes.io/projected/51e87d5c-a737-4250-b885-3d7ebbd2c803-kube-api-access-z8r5m\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:26 crc kubenswrapper[4834]: I0130 21:50:26.961074 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51e87d5c-a737-4250-b885-3d7ebbd2c803-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.242268 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" event={"ID":"51e87d5c-a737-4250-b885-3d7ebbd2c803","Type":"ContainerDied","Data":"e71ed4ae08936ad5de625b803d5ae7deb1aeef9e68f4a165c67a739321edf235"} Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.242313 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e71ed4ae08936ad5de625b803d5ae7deb1aeef9e68f4a165c67a739321edf235" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 
21:50:27.242384 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7qblf" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.320562 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh"] Jan 30 21:50:27 crc kubenswrapper[4834]: E0130 21:50:27.321220 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e87d5c-a737-4250-b885-3d7ebbd2c803" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.321305 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e87d5c-a737-4250-b885-3d7ebbd2c803" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.321592 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e87d5c-a737-4250-b885-3d7ebbd2c803" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.322324 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.324495 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.329068 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.329345 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.329968 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.339741 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh"] Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.471324 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.471597 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.471720 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qbt6\" (UniqueName: \"kubernetes.io/projected/e7629752-891c-46ea-b8ca-4156789f68b3-kube-api-access-7qbt6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.573525 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.574331 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.574468 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qbt6\" (UniqueName: \"kubernetes.io/projected/e7629752-891c-46ea-b8ca-4156789f68b3-kube-api-access-7qbt6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.579075 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.579179 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.590135 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qbt6\" (UniqueName: \"kubernetes.io/projected/e7629752-891c-46ea-b8ca-4156789f68b3-kube-api-access-7qbt6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx7fh\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:27 crc kubenswrapper[4834]: I0130 21:50:27.640629 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:50:28 crc kubenswrapper[4834]: I0130 21:50:28.170795 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh"] Jan 30 21:50:28 crc kubenswrapper[4834]: I0130 21:50:28.179856 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:50:28 crc kubenswrapper[4834]: I0130 21:50:28.252895 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" event={"ID":"e7629752-891c-46ea-b8ca-4156789f68b3","Type":"ContainerStarted","Data":"e8450efec29b48cb5e68acb7a27d3f65aea94dee15ebe9eec52ed995c2378c20"} Jan 30 21:50:29 crc kubenswrapper[4834]: I0130 21:50:29.267698 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" event={"ID":"e7629752-891c-46ea-b8ca-4156789f68b3","Type":"ContainerStarted","Data":"cb24e67ad35f1b82b53af58dbe70431383ec07e5cc8ccb60786471c7101a1fb3"} Jan 30 21:50:29 crc kubenswrapper[4834]: I0130 21:50:29.290718 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" podStartSLOduration=1.897351108 podStartE2EDuration="2.290696994s" podCreationTimestamp="2026-01-30 21:50:27 +0000 UTC" firstStartedPulling="2026-01-30 21:50:28.179643203 +0000 UTC m=+2079.332789341" lastFinishedPulling="2026-01-30 21:50:28.572989089 +0000 UTC m=+2079.726135227" observedRunningTime="2026-01-30 21:50:29.284089067 +0000 UTC m=+2080.437235205" watchObservedRunningTime="2026-01-30 21:50:29.290696994 +0000 UTC m=+2080.443843132" Jan 30 21:50:34 crc kubenswrapper[4834]: I0130 21:50:34.161552 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:50:34 crc kubenswrapper[4834]: I0130 21:50:34.162053 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:50:34 crc kubenswrapper[4834]: I0130 21:50:34.162101 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:50:34 crc kubenswrapper[4834]: I0130 21:50:34.163029 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ecbc7678f260ea9142896ba0b31a8f28c33db200a7138d092326369289802d4"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:50:34 crc kubenswrapper[4834]: I0130 21:50:34.163117 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://3ecbc7678f260ea9142896ba0b31a8f28c33db200a7138d092326369289802d4" gracePeriod=600 Jan 30 21:50:34 crc kubenswrapper[4834]: I0130 21:50:34.349735 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="3ecbc7678f260ea9142896ba0b31a8f28c33db200a7138d092326369289802d4" exitCode=0 Jan 30 21:50:34 crc kubenswrapper[4834]: I0130 21:50:34.349804 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" 
event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"3ecbc7678f260ea9142896ba0b31a8f28c33db200a7138d092326369289802d4"} Jan 30 21:50:34 crc kubenswrapper[4834]: I0130 21:50:34.349854 4834 scope.go:117] "RemoveContainer" containerID="a3ed0c1acd5ba306c85c3be1060c653cf036e922e899f97dc1a0a81cd84a184a" Jan 30 21:50:35 crc kubenswrapper[4834]: I0130 21:50:35.365576 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87"} Jan 30 21:50:45 crc kubenswrapper[4834]: I0130 21:50:45.046151 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8xqlg"] Jan 30 21:50:45 crc kubenswrapper[4834]: I0130 21:50:45.062175 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8xqlg"] Jan 30 21:50:45 crc kubenswrapper[4834]: I0130 21:50:45.542617 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bb78972-66ad-4912-b43d-40c181fda896" path="/var/lib/kubelet/pods/0bb78972-66ad-4912-b43d-40c181fda896/volumes" Jan 30 21:51:00 crc kubenswrapper[4834]: I0130 21:51:00.054940 4834 scope.go:117] "RemoveContainer" containerID="71c249264e33b8778c50b76b46f62f8e5296fe02023a413d18833ac4fa4484fa" Jan 30 21:51:00 crc kubenswrapper[4834]: I0130 21:51:00.096851 4834 scope.go:117] "RemoveContainer" containerID="e41d8ddee34c397e7d949eb9a68e780c9a835c0c6f542f30abc6e93b93f3b631" Jan 30 21:51:05 crc kubenswrapper[4834]: I0130 21:51:05.711053 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7629752-891c-46ea-b8ca-4156789f68b3" containerID="cb24e67ad35f1b82b53af58dbe70431383ec07e5cc8ccb60786471c7101a1fb3" exitCode=0 Jan 30 21:51:05 crc kubenswrapper[4834]: I0130 21:51:05.711140 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" event={"ID":"e7629752-891c-46ea-b8ca-4156789f68b3","Type":"ContainerDied","Data":"cb24e67ad35f1b82b53af58dbe70431383ec07e5cc8ccb60786471c7101a1fb3"} Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.144454 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.256318 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qbt6\" (UniqueName: \"kubernetes.io/projected/e7629752-891c-46ea-b8ca-4156789f68b3-kube-api-access-7qbt6\") pod \"e7629752-891c-46ea-b8ca-4156789f68b3\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.256469 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-inventory\") pod \"e7629752-891c-46ea-b8ca-4156789f68b3\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.256552 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-ssh-key-openstack-edpm-ipam\") pod \"e7629752-891c-46ea-b8ca-4156789f68b3\" (UID: \"e7629752-891c-46ea-b8ca-4156789f68b3\") " Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.265613 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7629752-891c-46ea-b8ca-4156789f68b3-kube-api-access-7qbt6" (OuterVolumeSpecName: "kube-api-access-7qbt6") pod "e7629752-891c-46ea-b8ca-4156789f68b3" (UID: "e7629752-891c-46ea-b8ca-4156789f68b3"). InnerVolumeSpecName "kube-api-access-7qbt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.291859 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7629752-891c-46ea-b8ca-4156789f68b3" (UID: "e7629752-891c-46ea-b8ca-4156789f68b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.301203 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-inventory" (OuterVolumeSpecName: "inventory") pod "e7629752-891c-46ea-b8ca-4156789f68b3" (UID: "e7629752-891c-46ea-b8ca-4156789f68b3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.359603 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qbt6\" (UniqueName: \"kubernetes.io/projected/e7629752-891c-46ea-b8ca-4156789f68b3-kube-api-access-7qbt6\") on node \"crc\" DevicePath \"\"" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.359639 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.359652 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7629752-891c-46ea-b8ca-4156789f68b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.732383 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" 
event={"ID":"e7629752-891c-46ea-b8ca-4156789f68b3","Type":"ContainerDied","Data":"e8450efec29b48cb5e68acb7a27d3f65aea94dee15ebe9eec52ed995c2378c20"} Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.732476 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8450efec29b48cb5e68acb7a27d3f65aea94dee15ebe9eec52ed995c2378c20" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.732417 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx7fh" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.808435 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45"] Jan 30 21:51:07 crc kubenswrapper[4834]: E0130 21:51:07.809133 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7629752-891c-46ea-b8ca-4156789f68b3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.809213 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7629752-891c-46ea-b8ca-4156789f68b3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.809472 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7629752-891c-46ea-b8ca-4156789f68b3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.810198 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.814234 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.814622 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.814773 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.815651 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.821886 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45"] Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.971504 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.971595 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:07 crc kubenswrapper[4834]: I0130 21:51:07.971679 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb9kk\" (UniqueName: \"kubernetes.io/projected/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-kube-api-access-nb9kk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:08 crc kubenswrapper[4834]: I0130 21:51:08.073505 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:08 crc kubenswrapper[4834]: I0130 21:51:08.073681 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb9kk\" (UniqueName: \"kubernetes.io/projected/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-kube-api-access-nb9kk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:08 crc kubenswrapper[4834]: I0130 21:51:08.073780 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:08 crc kubenswrapper[4834]: I0130 21:51:08.078570 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:08 crc kubenswrapper[4834]: I0130 21:51:08.078863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:08 crc kubenswrapper[4834]: I0130 21:51:08.094624 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb9kk\" (UniqueName: \"kubernetes.io/projected/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-kube-api-access-nb9kk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mfc45\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:08 crc kubenswrapper[4834]: I0130 21:51:08.166029 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:51:08 crc kubenswrapper[4834]: I0130 21:51:08.779644 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45"] Jan 30 21:51:09 crc kubenswrapper[4834]: I0130 21:51:09.748275 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" event={"ID":"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5","Type":"ContainerStarted","Data":"620a553673825596a15b53d28e110ee9a90e510d1f827c6f9ed080c99fdf1c67"} Jan 30 21:51:10 crc kubenswrapper[4834]: I0130 21:51:10.764495 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" event={"ID":"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5","Type":"ContainerStarted","Data":"dc33233dbad3e94817dfc74f4829ba991ee0b1221bc289a1fa1ac387855b6900"} Jan 30 21:51:10 crc kubenswrapper[4834]: I0130 21:51:10.786353 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" podStartSLOduration=2.636323543 podStartE2EDuration="3.786328915s" podCreationTimestamp="2026-01-30 21:51:07 +0000 UTC" firstStartedPulling="2026-01-30 21:51:08.782618797 +0000 UTC m=+2119.935764935" lastFinishedPulling="2026-01-30 21:51:09.932624169 +0000 UTC m=+2121.085770307" observedRunningTime="2026-01-30 21:51:10.783906596 +0000 UTC m=+2121.937052734" watchObservedRunningTime="2026-01-30 21:51:10.786328915 +0000 UTC m=+2121.939475053" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.194876 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8n47l"] Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.197980 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.217227 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8n47l"] Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.353468 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-catalog-content\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.353535 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnp9t\" (UniqueName: \"kubernetes.io/projected/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-kube-api-access-xnp9t\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.353592 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-utilities\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.455828 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnp9t\" (UniqueName: \"kubernetes.io/projected/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-kube-api-access-xnp9t\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.455908 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-utilities\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.456215 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-catalog-content\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.456496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-utilities\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.456621 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-catalog-content\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.490329 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnp9t\" (UniqueName: \"kubernetes.io/projected/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-kube-api-access-xnp9t\") pod \"redhat-operators-8n47l\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:53 crc kubenswrapper[4834]: I0130 21:51:53.520702 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:51:54 crc kubenswrapper[4834]: I0130 21:51:54.057826 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8n47l"] Jan 30 21:51:54 crc kubenswrapper[4834]: I0130 21:51:54.168445 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n47l" event={"ID":"2183f416-a8bc-4d42-915a-aa8e3dffe7e9","Type":"ContainerStarted","Data":"26df83de45359f5e6d910e916fd66ee8560cbb41858d9581e0a47bb719438747"} Jan 30 21:51:55 crc kubenswrapper[4834]: I0130 21:51:55.180218 4834 generic.go:334] "Generic (PLEG): container finished" podID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerID="8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5" exitCode=0 Jan 30 21:51:55 crc kubenswrapper[4834]: I0130 21:51:55.180344 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n47l" event={"ID":"2183f416-a8bc-4d42-915a-aa8e3dffe7e9","Type":"ContainerDied","Data":"8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5"} Jan 30 21:51:57 crc kubenswrapper[4834]: I0130 21:51:57.205171 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n47l" event={"ID":"2183f416-a8bc-4d42-915a-aa8e3dffe7e9","Type":"ContainerStarted","Data":"b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509"} Jan 30 21:51:58 crc kubenswrapper[4834]: I0130 21:51:58.215339 4834 generic.go:334] "Generic (PLEG): container finished" podID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerID="b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509" exitCode=0 Jan 30 21:51:58 crc kubenswrapper[4834]: I0130 21:51:58.215386 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n47l" 
event={"ID":"2183f416-a8bc-4d42-915a-aa8e3dffe7e9","Type":"ContainerDied","Data":"b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509"} Jan 30 21:52:00 crc kubenswrapper[4834]: I0130 21:52:00.243068 4834 generic.go:334] "Generic (PLEG): container finished" podID="b0d0b92f-1774-4f57-8d8d-df4a1228e2d5" containerID="dc33233dbad3e94817dfc74f4829ba991ee0b1221bc289a1fa1ac387855b6900" exitCode=0 Jan 30 21:52:00 crc kubenswrapper[4834]: I0130 21:52:00.243178 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" event={"ID":"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5","Type":"ContainerDied","Data":"dc33233dbad3e94817dfc74f4829ba991ee0b1221bc289a1fa1ac387855b6900"} Jan 30 21:52:01 crc kubenswrapper[4834]: I0130 21:52:01.254491 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n47l" event={"ID":"2183f416-a8bc-4d42-915a-aa8e3dffe7e9","Type":"ContainerStarted","Data":"73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60"} Jan 30 21:52:01 crc kubenswrapper[4834]: I0130 21:52:01.286309 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8n47l" podStartSLOduration=2.852385651 podStartE2EDuration="8.286287202s" podCreationTimestamp="2026-01-30 21:51:53 +0000 UTC" firstStartedPulling="2026-01-30 21:51:55.181816887 +0000 UTC m=+2166.334963035" lastFinishedPulling="2026-01-30 21:52:00.615718448 +0000 UTC m=+2171.768864586" observedRunningTime="2026-01-30 21:52:01.279448999 +0000 UTC m=+2172.432595197" watchObservedRunningTime="2026-01-30 21:52:01.286287202 +0000 UTC m=+2172.439433340" Jan 30 21:52:01 crc kubenswrapper[4834]: I0130 21:52:01.842149 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:52:01 crc kubenswrapper[4834]: I0130 21:52:01.973027 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-inventory\") pod \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " Jan 30 21:52:01 crc kubenswrapper[4834]: I0130 21:52:01.973141 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb9kk\" (UniqueName: \"kubernetes.io/projected/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-kube-api-access-nb9kk\") pod \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " Jan 30 21:52:01 crc kubenswrapper[4834]: I0130 21:52:01.973214 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-ssh-key-openstack-edpm-ipam\") pod \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\" (UID: \"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5\") " Jan 30 21:52:01 crc kubenswrapper[4834]: I0130 21:52:01.978769 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-kube-api-access-nb9kk" (OuterVolumeSpecName: "kube-api-access-nb9kk") pod "b0d0b92f-1774-4f57-8d8d-df4a1228e2d5" (UID: "b0d0b92f-1774-4f57-8d8d-df4a1228e2d5"). InnerVolumeSpecName "kube-api-access-nb9kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.001988 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0d0b92f-1774-4f57-8d8d-df4a1228e2d5" (UID: "b0d0b92f-1774-4f57-8d8d-df4a1228e2d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.002890 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-inventory" (OuterVolumeSpecName: "inventory") pod "b0d0b92f-1774-4f57-8d8d-df4a1228e2d5" (UID: "b0d0b92f-1774-4f57-8d8d-df4a1228e2d5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.075206 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb9kk\" (UniqueName: \"kubernetes.io/projected/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-kube-api-access-nb9kk\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.075238 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.075248 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0d0b92f-1774-4f57-8d8d-df4a1228e2d5-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.264189 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.264186 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mfc45" event={"ID":"b0d0b92f-1774-4f57-8d8d-df4a1228e2d5","Type":"ContainerDied","Data":"620a553673825596a15b53d28e110ee9a90e510d1f827c6f9ed080c99fdf1c67"} Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.264256 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="620a553673825596a15b53d28e110ee9a90e510d1f827c6f9ed080c99fdf1c67" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.377176 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zwv5h"] Jan 30 21:52:02 crc kubenswrapper[4834]: E0130 21:52:02.377928 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0b92f-1774-4f57-8d8d-df4a1228e2d5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.377960 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0b92f-1774-4f57-8d8d-df4a1228e2d5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.378293 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0b92f-1774-4f57-8d8d-df4a1228e2d5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.379223 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.384304 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.384504 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.385026 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.385144 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.387447 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zwv5h"] Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.482554 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.482860 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmzd\" (UniqueName: \"kubernetes.io/projected/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-kube-api-access-tnmzd\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.483111 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.585248 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.585354 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.585409 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmzd\" (UniqueName: \"kubernetes.io/projected/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-kube-api-access-tnmzd\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.590157 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: 
I0130 21:52:02.595856 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.601670 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmzd\" (UniqueName: \"kubernetes.io/projected/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-kube-api-access-tnmzd\") pod \"ssh-known-hosts-edpm-deployment-zwv5h\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:02 crc kubenswrapper[4834]: I0130 21:52:02.709643 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:03 crc kubenswrapper[4834]: I0130 21:52:03.370343 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-zwv5h"] Jan 30 21:52:03 crc kubenswrapper[4834]: I0130 21:52:03.521576 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:52:03 crc kubenswrapper[4834]: I0130 21:52:03.522043 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:52:04 crc kubenswrapper[4834]: I0130 21:52:04.283244 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" event={"ID":"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d","Type":"ContainerStarted","Data":"0901a32dcc4b4b7568edc72d056c2b0599045caf85765906abfc4a55b0b3de34"} Jan 30 21:52:04 crc kubenswrapper[4834]: I0130 21:52:04.283631 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" 
event={"ID":"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d","Type":"ContainerStarted","Data":"5f42011d2bc53d356add9a2315042cca8b5450bafc1126efdd733e4a6f11a9f7"} Jan 30 21:52:04 crc kubenswrapper[4834]: I0130 21:52:04.305236 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" podStartSLOduration=1.87687539 podStartE2EDuration="2.305217135s" podCreationTimestamp="2026-01-30 21:52:02 +0000 UTC" firstStartedPulling="2026-01-30 21:52:03.394367636 +0000 UTC m=+2174.547513774" lastFinishedPulling="2026-01-30 21:52:03.822709361 +0000 UTC m=+2174.975855519" observedRunningTime="2026-01-30 21:52:04.301470119 +0000 UTC m=+2175.454616257" watchObservedRunningTime="2026-01-30 21:52:04.305217135 +0000 UTC m=+2175.458363263" Jan 30 21:52:04 crc kubenswrapper[4834]: I0130 21:52:04.572047 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8n47l" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="registry-server" probeResult="failure" output=< Jan 30 21:52:04 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 21:52:04 crc kubenswrapper[4834]: > Jan 30 21:52:11 crc kubenswrapper[4834]: I0130 21:52:11.345682 4834 generic.go:334] "Generic (PLEG): container finished" podID="dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d" containerID="0901a32dcc4b4b7568edc72d056c2b0599045caf85765906abfc4a55b0b3de34" exitCode=0 Jan 30 21:52:11 crc kubenswrapper[4834]: I0130 21:52:11.345762 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" event={"ID":"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d","Type":"ContainerDied","Data":"0901a32dcc4b4b7568edc72d056c2b0599045caf85765906abfc4a55b0b3de34"} Jan 30 21:52:12 crc kubenswrapper[4834]: I0130 21:52:12.856310 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:12.999943 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmzd\" (UniqueName: \"kubernetes.io/projected/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-kube-api-access-tnmzd\") pod \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.000497 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-inventory-0\") pod \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.000734 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-ssh-key-openstack-edpm-ipam\") pod \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\" (UID: \"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d\") " Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.006612 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-kube-api-access-tnmzd" (OuterVolumeSpecName: "kube-api-access-tnmzd") pod "dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d" (UID: "dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d"). InnerVolumeSpecName "kube-api-access-tnmzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.029887 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d" (UID: "dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.036659 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d" (UID: "dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.103167 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.103197 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnmzd\" (UniqueName: \"kubernetes.io/projected/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-kube-api-access-tnmzd\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.103206 4834 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.368012 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" 
event={"ID":"dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d","Type":"ContainerDied","Data":"5f42011d2bc53d356add9a2315042cca8b5450bafc1126efdd733e4a6f11a9f7"} Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.368122 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f42011d2bc53d356add9a2315042cca8b5450bafc1126efdd733e4a6f11a9f7" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.368315 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-zwv5h" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.453693 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl"] Jan 30 21:52:13 crc kubenswrapper[4834]: E0130 21:52:13.454237 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d" containerName="ssh-known-hosts-edpm-deployment" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.454253 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d" containerName="ssh-known-hosts-edpm-deployment" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.454499 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d" containerName="ssh-known-hosts-edpm-deployment" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.455226 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.459269 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.459415 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.460490 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.460546 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.472874 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl"] Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.509616 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.509754 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.509826 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wjnn\" (UniqueName: \"kubernetes.io/projected/1309ebf1-3a18-4898-8137-e3658586a506-kube-api-access-2wjnn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.577488 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.611657 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.611763 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.611807 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wjnn\" (UniqueName: \"kubernetes.io/projected/1309ebf1-3a18-4898-8137-e3658586a506-kube-api-access-2wjnn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.616509 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.627631 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.628090 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.628934 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wjnn\" (UniqueName: \"kubernetes.io/projected/1309ebf1-3a18-4898-8137-e3658586a506-kube-api-access-2wjnn\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-42nbl\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.784278 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:13 crc kubenswrapper[4834]: I0130 21:52:13.822135 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8n47l"] Jan 30 21:52:14 crc kubenswrapper[4834]: I0130 21:52:14.312818 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl"] Jan 30 21:52:14 crc kubenswrapper[4834]: I0130 21:52:14.380175 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" event={"ID":"1309ebf1-3a18-4898-8137-e3658586a506","Type":"ContainerStarted","Data":"6fe6c6e732f86a5b9d852ceca6480bcd7bace7e950ed0ef757df9344e37a7d3f"} Jan 30 21:52:15 crc kubenswrapper[4834]: I0130 21:52:15.392633 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8n47l" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="registry-server" containerID="cri-o://73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60" gracePeriod=2 Jan 30 21:52:15 crc kubenswrapper[4834]: I0130 21:52:15.963420 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.064494 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-catalog-content\") pod \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.064559 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnp9t\" (UniqueName: \"kubernetes.io/projected/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-kube-api-access-xnp9t\") pod \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.064634 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-utilities\") pod \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\" (UID: \"2183f416-a8bc-4d42-915a-aa8e3dffe7e9\") " Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.066336 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-utilities" (OuterVolumeSpecName: "utilities") pod "2183f416-a8bc-4d42-915a-aa8e3dffe7e9" (UID: "2183f416-a8bc-4d42-915a-aa8e3dffe7e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.076340 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-kube-api-access-xnp9t" (OuterVolumeSpecName: "kube-api-access-xnp9t") pod "2183f416-a8bc-4d42-915a-aa8e3dffe7e9" (UID: "2183f416-a8bc-4d42-915a-aa8e3dffe7e9"). InnerVolumeSpecName "kube-api-access-xnp9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.166629 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnp9t\" (UniqueName: \"kubernetes.io/projected/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-kube-api-access-xnp9t\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.166665 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.178571 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2183f416-a8bc-4d42-915a-aa8e3dffe7e9" (UID: "2183f416-a8bc-4d42-915a-aa8e3dffe7e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.268209 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2183f416-a8bc-4d42-915a-aa8e3dffe7e9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.401958 4834 generic.go:334] "Generic (PLEG): container finished" podID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerID="73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60" exitCode=0 Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.402085 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8n47l" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.402237 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n47l" event={"ID":"2183f416-a8bc-4d42-915a-aa8e3dffe7e9","Type":"ContainerDied","Data":"73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60"} Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.402269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n47l" event={"ID":"2183f416-a8bc-4d42-915a-aa8e3dffe7e9","Type":"ContainerDied","Data":"26df83de45359f5e6d910e916fd66ee8560cbb41858d9581e0a47bb719438747"} Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.402288 4834 scope.go:117] "RemoveContainer" containerID="73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.404238 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" event={"ID":"1309ebf1-3a18-4898-8137-e3658586a506","Type":"ContainerStarted","Data":"3e74c246380cc16a36e9959a04dd0e516b457f8bf035f22d39e6a56d3c4f4f20"} Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.428077 4834 scope.go:117] "RemoveContainer" containerID="b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.435507 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" podStartSLOduration=2.772432073 podStartE2EDuration="3.435488976s" podCreationTimestamp="2026-01-30 21:52:13 +0000 UTC" firstStartedPulling="2026-01-30 21:52:14.315022052 +0000 UTC m=+2185.468168190" lastFinishedPulling="2026-01-30 21:52:14.978078955 +0000 UTC m=+2186.131225093" observedRunningTime="2026-01-30 21:52:16.427653884 +0000 UTC m=+2187.580800032" 
watchObservedRunningTime="2026-01-30 21:52:16.435488976 +0000 UTC m=+2187.588635104" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.456609 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8n47l"] Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.458288 4834 scope.go:117] "RemoveContainer" containerID="8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.466808 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8n47l"] Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.500953 4834 scope.go:117] "RemoveContainer" containerID="73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60" Jan 30 21:52:16 crc kubenswrapper[4834]: E0130 21:52:16.501411 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60\": container with ID starting with 73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60 not found: ID does not exist" containerID="73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.501551 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60"} err="failed to get container status \"73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60\": rpc error: code = NotFound desc = could not find container \"73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60\": container with ID starting with 73c904779c822f9f51d1216e6f44438364948a099d951d16ae9eff123d35ac60 not found: ID does not exist" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.501690 4834 scope.go:117] "RemoveContainer" 
containerID="b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509" Jan 30 21:52:16 crc kubenswrapper[4834]: E0130 21:52:16.502255 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509\": container with ID starting with b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509 not found: ID does not exist" containerID="b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.502303 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509"} err="failed to get container status \"b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509\": rpc error: code = NotFound desc = could not find container \"b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509\": container with ID starting with b2391c0668b9994fecfe48c2dcba6d95d242240c741a7a95bb10564b6008f509 not found: ID does not exist" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.502320 4834 scope.go:117] "RemoveContainer" containerID="8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5" Jan 30 21:52:16 crc kubenswrapper[4834]: E0130 21:52:16.502808 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5\": container with ID starting with 8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5 not found: ID does not exist" containerID="8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5" Jan 30 21:52:16 crc kubenswrapper[4834]: I0130 21:52:16.502852 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5"} err="failed to get container status \"8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5\": rpc error: code = NotFound desc = could not find container \"8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5\": container with ID starting with 8570bd5fe44c1c3a5551bfdf376cefc4adbed966c16d281a5718b6edcefe87b5 not found: ID does not exist" Jan 30 21:52:17 crc kubenswrapper[4834]: I0130 21:52:17.553761 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" path="/var/lib/kubelet/pods/2183f416-a8bc-4d42-915a-aa8e3dffe7e9/volumes" Jan 30 21:52:23 crc kubenswrapper[4834]: I0130 21:52:23.473169 4834 generic.go:334] "Generic (PLEG): container finished" podID="1309ebf1-3a18-4898-8137-e3658586a506" containerID="3e74c246380cc16a36e9959a04dd0e516b457f8bf035f22d39e6a56d3c4f4f20" exitCode=0 Jan 30 21:52:23 crc kubenswrapper[4834]: I0130 21:52:23.473290 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" event={"ID":"1309ebf1-3a18-4898-8137-e3658586a506","Type":"ContainerDied","Data":"3e74c246380cc16a36e9959a04dd0e516b457f8bf035f22d39e6a56d3c4f4f20"} Jan 30 21:52:24 crc kubenswrapper[4834]: I0130 21:52:24.927592 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.035860 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-ssh-key-openstack-edpm-ipam\") pod \"1309ebf1-3a18-4898-8137-e3658586a506\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.036189 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-inventory\") pod \"1309ebf1-3a18-4898-8137-e3658586a506\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.036480 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wjnn\" (UniqueName: \"kubernetes.io/projected/1309ebf1-3a18-4898-8137-e3658586a506-kube-api-access-2wjnn\") pod \"1309ebf1-3a18-4898-8137-e3658586a506\" (UID: \"1309ebf1-3a18-4898-8137-e3658586a506\") " Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.040894 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1309ebf1-3a18-4898-8137-e3658586a506-kube-api-access-2wjnn" (OuterVolumeSpecName: "kube-api-access-2wjnn") pod "1309ebf1-3a18-4898-8137-e3658586a506" (UID: "1309ebf1-3a18-4898-8137-e3658586a506"). InnerVolumeSpecName "kube-api-access-2wjnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.062315 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-inventory" (OuterVolumeSpecName: "inventory") pod "1309ebf1-3a18-4898-8137-e3658586a506" (UID: "1309ebf1-3a18-4898-8137-e3658586a506"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.075837 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1309ebf1-3a18-4898-8137-e3658586a506" (UID: "1309ebf1-3a18-4898-8137-e3658586a506"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.140136 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.140199 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1309ebf1-3a18-4898-8137-e3658586a506-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.140229 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wjnn\" (UniqueName: \"kubernetes.io/projected/1309ebf1-3a18-4898-8137-e3658586a506-kube-api-access-2wjnn\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.491608 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" event={"ID":"1309ebf1-3a18-4898-8137-e3658586a506","Type":"ContainerDied","Data":"6fe6c6e732f86a5b9d852ceca6480bcd7bace7e950ed0ef757df9344e37a7d3f"} Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.491880 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe6c6e732f86a5b9d852ceca6480bcd7bace7e950ed0ef757df9344e37a7d3f" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 
21:52:25.491646 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-42nbl" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.567103 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s"] Jan 30 21:52:25 crc kubenswrapper[4834]: E0130 21:52:25.567792 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="extract-content" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.567895 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="extract-content" Jan 30 21:52:25 crc kubenswrapper[4834]: E0130 21:52:25.567983 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1309ebf1-3a18-4898-8137-e3658586a506" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.568094 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1309ebf1-3a18-4898-8137-e3658586a506" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:25 crc kubenswrapper[4834]: E0130 21:52:25.568224 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="registry-server" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.568308 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="registry-server" Jan 30 21:52:25 crc kubenswrapper[4834]: E0130 21:52:25.568389 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="extract-utilities" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.568491 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="extract-utilities" Jan 30 
21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.568804 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2183f416-a8bc-4d42-915a-aa8e3dffe7e9" containerName="registry-server" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.568926 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1309ebf1-3a18-4898-8137-e3658586a506" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.570135 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.573006 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.573809 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.574134 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.576958 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.578212 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s"] Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.751150 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.751307 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmls\" (UniqueName: \"kubernetes.io/projected/a0352665-2322-46ae-b019-d20ccc580414-kube-api-access-fvmls\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.751621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.854197 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.854321 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmls\" (UniqueName: \"kubernetes.io/projected/a0352665-2322-46ae-b019-d20ccc580414-kube-api-access-fvmls\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.854429 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.860137 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.860147 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.872849 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmls\" (UniqueName: \"kubernetes.io/projected/a0352665-2322-46ae-b019-d20ccc580414-kube-api-access-fvmls\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:25 crc kubenswrapper[4834]: I0130 21:52:25.887218 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:26 crc kubenswrapper[4834]: I0130 21:52:26.427609 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s"] Jan 30 21:52:26 crc kubenswrapper[4834]: I0130 21:52:26.500916 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" event={"ID":"a0352665-2322-46ae-b019-d20ccc580414","Type":"ContainerStarted","Data":"56ef373aef9ee88699492bfe82f2deed70553d17e9555e2e75ea18514d884338"} Jan 30 21:52:27 crc kubenswrapper[4834]: I0130 21:52:27.510211 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" event={"ID":"a0352665-2322-46ae-b019-d20ccc580414","Type":"ContainerStarted","Data":"4aed1894bead263611c1c92cfcc3682639c8c95de1bfd1363a0143f60f7c0213"} Jan 30 21:52:27 crc kubenswrapper[4834]: I0130 21:52:27.528568 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" podStartSLOduration=1.991076405 podStartE2EDuration="2.52855141s" podCreationTimestamp="2026-01-30 21:52:25 +0000 UTC" firstStartedPulling="2026-01-30 21:52:26.430447305 +0000 UTC m=+2197.583593443" lastFinishedPulling="2026-01-30 21:52:26.9679223 +0000 UTC m=+2198.121068448" observedRunningTime="2026-01-30 21:52:27.523461466 +0000 UTC m=+2198.676607634" watchObservedRunningTime="2026-01-30 21:52:27.52855141 +0000 UTC m=+2198.681697548" Jan 30 21:52:34 crc kubenswrapper[4834]: I0130 21:52:34.160854 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:52:34 crc kubenswrapper[4834]: 
I0130 21:52:34.161449 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:52:36 crc kubenswrapper[4834]: I0130 21:52:36.599606 4834 generic.go:334] "Generic (PLEG): container finished" podID="a0352665-2322-46ae-b019-d20ccc580414" containerID="4aed1894bead263611c1c92cfcc3682639c8c95de1bfd1363a0143f60f7c0213" exitCode=0 Jan 30 21:52:36 crc kubenswrapper[4834]: I0130 21:52:36.599690 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" event={"ID":"a0352665-2322-46ae-b019-d20ccc580414","Type":"ContainerDied","Data":"4aed1894bead263611c1c92cfcc3682639c8c95de1bfd1363a0143f60f7c0213"} Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.048702 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.121171 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-inventory\") pod \"a0352665-2322-46ae-b019-d20ccc580414\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.121246 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvmls\" (UniqueName: \"kubernetes.io/projected/a0352665-2322-46ae-b019-d20ccc580414-kube-api-access-fvmls\") pod \"a0352665-2322-46ae-b019-d20ccc580414\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.121534 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-ssh-key-openstack-edpm-ipam\") pod \"a0352665-2322-46ae-b019-d20ccc580414\" (UID: \"a0352665-2322-46ae-b019-d20ccc580414\") " Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.129161 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0352665-2322-46ae-b019-d20ccc580414-kube-api-access-fvmls" (OuterVolumeSpecName: "kube-api-access-fvmls") pod "a0352665-2322-46ae-b019-d20ccc580414" (UID: "a0352665-2322-46ae-b019-d20ccc580414"). InnerVolumeSpecName "kube-api-access-fvmls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.150942 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a0352665-2322-46ae-b019-d20ccc580414" (UID: "a0352665-2322-46ae-b019-d20ccc580414"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.154454 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-inventory" (OuterVolumeSpecName: "inventory") pod "a0352665-2322-46ae-b019-d20ccc580414" (UID: "a0352665-2322-46ae-b019-d20ccc580414"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.223797 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.223837 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0352665-2322-46ae-b019-d20ccc580414-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.223849 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvmls\" (UniqueName: \"kubernetes.io/projected/a0352665-2322-46ae-b019-d20ccc580414-kube-api-access-fvmls\") on node \"crc\" DevicePath \"\"" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.620556 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" 
event={"ID":"a0352665-2322-46ae-b019-d20ccc580414","Type":"ContainerDied","Data":"56ef373aef9ee88699492bfe82f2deed70553d17e9555e2e75ea18514d884338"} Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.621118 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56ef373aef9ee88699492bfe82f2deed70553d17e9555e2e75ea18514d884338" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.620654 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.722835 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv"] Jan 30 21:52:38 crc kubenswrapper[4834]: E0130 21:52:38.723308 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0352665-2322-46ae-b019-d20ccc580414" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.723331 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0352665-2322-46ae-b019-d20ccc580414" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.723593 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0352665-2322-46ae-b019-d20ccc580414" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.724450 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.728098 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.728150 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.728160 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.728300 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.728935 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.729332 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.731729 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.731731 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.737294 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv"] Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.835780 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836045 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836079 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836110 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836188 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836253 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836288 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836318 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836336 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836372 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836449 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836467 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836499 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.836517 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvcsp\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-kube-api-access-nvcsp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937572 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937623 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937657 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937676 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937698 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937742 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937758 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937779 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937797 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvcsp\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-kube-api-access-nvcsp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937842 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937873 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937893 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937920 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.937963 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.942406 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.942463 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.943594 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.943676 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.951297 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 
21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.951410 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.951689 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.952572 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.952591 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.953135 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.953981 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.954456 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.954600 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:38 crc kubenswrapper[4834]: I0130 21:52:38.956505 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvcsp\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-kube-api-access-nvcsp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv\" (UID: 
\"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:39 crc kubenswrapper[4834]: I0130 21:52:39.044797 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:52:39 crc kubenswrapper[4834]: W0130 21:52:39.566064 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee0fe911_9c60_4565_9f57_b3d8efcd1aa3.slice/crio-799f4e583214dd477609faf569f6ee56677332bd128eb849d367e77730b9d3b0 WatchSource:0}: Error finding container 799f4e583214dd477609faf569f6ee56677332bd128eb849d367e77730b9d3b0: Status 404 returned error can't find the container with id 799f4e583214dd477609faf569f6ee56677332bd128eb849d367e77730b9d3b0 Jan 30 21:52:39 crc kubenswrapper[4834]: I0130 21:52:39.568995 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv"] Jan 30 21:52:39 crc kubenswrapper[4834]: I0130 21:52:39.631959 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" event={"ID":"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3","Type":"ContainerStarted","Data":"799f4e583214dd477609faf569f6ee56677332bd128eb849d367e77730b9d3b0"} Jan 30 21:52:40 crc kubenswrapper[4834]: I0130 21:52:40.641489 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" event={"ID":"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3","Type":"ContainerStarted","Data":"84defecf3f5c2275a171a53e3462d70d18660a810a30ba3ae8bc289fd0c34ede"} Jan 30 21:52:40 crc kubenswrapper[4834]: I0130 21:52:40.666855 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" podStartSLOduration=2.240725169 
podStartE2EDuration="2.666837841s" podCreationTimestamp="2026-01-30 21:52:38 +0000 UTC" firstStartedPulling="2026-01-30 21:52:39.568314203 +0000 UTC m=+2210.721460351" lastFinishedPulling="2026-01-30 21:52:39.994426885 +0000 UTC m=+2211.147573023" observedRunningTime="2026-01-30 21:52:40.660209424 +0000 UTC m=+2211.813355572" watchObservedRunningTime="2026-01-30 21:52:40.666837841 +0000 UTC m=+2211.819983979" Jan 30 21:53:04 crc kubenswrapper[4834]: I0130 21:53:04.161522 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:53:04 crc kubenswrapper[4834]: I0130 21:53:04.162198 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:53:18 crc kubenswrapper[4834]: I0130 21:53:18.009932 4834 generic.go:334] "Generic (PLEG): container finished" podID="ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" containerID="84defecf3f5c2275a171a53e3462d70d18660a810a30ba3ae8bc289fd0c34ede" exitCode=0 Jan 30 21:53:18 crc kubenswrapper[4834]: I0130 21:53:18.010056 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" event={"ID":"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3","Type":"ContainerDied","Data":"84defecf3f5c2275a171a53e3462d70d18660a810a30ba3ae8bc289fd0c34ede"} Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.438367 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.555829 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-repo-setup-combined-ca-bundle\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.555946 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ovn-combined-ca-bundle\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.555987 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-inventory\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556017 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvcsp\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-kube-api-access-nvcsp\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556053 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc 
kubenswrapper[4834]: I0130 21:53:19.556112 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556149 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ssh-key-openstack-edpm-ipam\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556172 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-nova-combined-ca-bundle\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556206 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-bootstrap-combined-ca-bundle\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556258 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556328 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-telemetry-combined-ca-bundle\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556411 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556581 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-libvirt-combined-ca-bundle\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.556645 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-neutron-metadata-combined-ca-bundle\") pod \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\" (UID: \"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3\") " Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.566359 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.567183 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.567413 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.568100 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.571464 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.571877 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.572846 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.573639 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-kube-api-access-nvcsp" (OuterVolumeSpecName: "kube-api-access-nvcsp") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "kube-api-access-nvcsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.573776 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.574201 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.574981 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.575618 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.610173 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.613549 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-inventory" (OuterVolumeSpecName: "inventory") pod "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" (UID: "ee0fe911-9c60-4565-9f57-b3d8efcd1aa3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662728 4834 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662767 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662777 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662786 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvcsp\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-kube-api-access-nvcsp\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662795 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662805 4834 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662815 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662824 4834 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662834 4834 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662842 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662852 4834 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662861 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662871 4834 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:19 crc kubenswrapper[4834]: I0130 21:53:19.662879 4834 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee0fe911-9c60-4565-9f57-b3d8efcd1aa3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.030466 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" event={"ID":"ee0fe911-9c60-4565-9f57-b3d8efcd1aa3","Type":"ContainerDied","Data":"799f4e583214dd477609faf569f6ee56677332bd128eb849d367e77730b9d3b0"} Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.030842 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="799f4e583214dd477609faf569f6ee56677332bd128eb849d367e77730b9d3b0" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.030570 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.135763 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc"] Jan 30 21:53:20 crc kubenswrapper[4834]: E0130 21:53:20.136214 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.136236 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.136514 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0fe911-9c60-4565-9f57-b3d8efcd1aa3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.137301 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.139048 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.139382 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.139746 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.139898 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.140362 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.145832 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc"] Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.275534 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.275608 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.275659 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/112c7019-6e0b-4366-b959-e11750f43a26-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.275694 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkph\" (UniqueName: \"kubernetes.io/projected/112c7019-6e0b-4366-b959-e11750f43a26-kube-api-access-gjkph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.275718 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.377328 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkph\" (UniqueName: \"kubernetes.io/projected/112c7019-6e0b-4366-b959-e11750f43a26-kube-api-access-gjkph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.377679 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.377913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.378583 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.378784 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/112c7019-6e0b-4366-b959-e11750f43a26-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.380292 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/112c7019-6e0b-4366-b959-e11750f43a26-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.382124 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.387995 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.392001 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.395581 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkph\" (UniqueName: \"kubernetes.io/projected/112c7019-6e0b-4366-b959-e11750f43a26-kube-api-access-gjkph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hljjc\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:20 crc kubenswrapper[4834]: I0130 21:53:20.466175 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:53:21 crc kubenswrapper[4834]: I0130 21:53:21.016717 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc"] Jan 30 21:53:21 crc kubenswrapper[4834]: I0130 21:53:21.039444 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" event={"ID":"112c7019-6e0b-4366-b959-e11750f43a26","Type":"ContainerStarted","Data":"86385b50db8edec2cfff67185789368e104b9a19995eacbc10b45b01aa173120"} Jan 30 21:53:22 crc kubenswrapper[4834]: I0130 21:53:22.051523 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" event={"ID":"112c7019-6e0b-4366-b959-e11750f43a26","Type":"ContainerStarted","Data":"be5f1e90c26064ade6f9f0770c20e16b634114053051d04879e13619e329fbe0"} Jan 30 21:53:22 crc kubenswrapper[4834]: I0130 21:53:22.078489 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" podStartSLOduration=1.615379115 podStartE2EDuration="2.078466701s" podCreationTimestamp="2026-01-30 21:53:20 +0000 UTC" firstStartedPulling="2026-01-30 21:53:21.029849401 +0000 UTC m=+2252.182995539" lastFinishedPulling="2026-01-30 21:53:21.492936977 +0000 UTC m=+2252.646083125" observedRunningTime="2026-01-30 21:53:22.067792229 +0000 UTC m=+2253.220938367" watchObservedRunningTime="2026-01-30 21:53:22.078466701 +0000 UTC m=+2253.231612849" Jan 30 21:53:34 crc kubenswrapper[4834]: I0130 21:53:34.160977 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 21:53:34 crc kubenswrapper[4834]: I0130 21:53:34.163091 4834 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 21:53:34 crc kubenswrapper[4834]: I0130 21:53:34.163219 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 21:53:34 crc kubenswrapper[4834]: I0130 21:53:34.164299 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 21:53:34 crc kubenswrapper[4834]: I0130 21:53:34.164493 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" gracePeriod=600 Jan 30 21:53:34 crc kubenswrapper[4834]: E0130 21:53:34.293979 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:53:35 crc kubenswrapper[4834]: I0130 21:53:35.183115 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" exitCode=0 Jan 30 21:53:35 crc kubenswrapper[4834]: I0130 21:53:35.183183 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87"} Jan 30 21:53:35 crc kubenswrapper[4834]: I0130 21:53:35.183532 4834 scope.go:117] "RemoveContainer" containerID="3ecbc7678f260ea9142896ba0b31a8f28c33db200a7138d092326369289802d4" Jan 30 21:53:35 crc kubenswrapper[4834]: I0130 21:53:35.184201 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:53:35 crc kubenswrapper[4834]: E0130 21:53:35.184813 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:53:48 crc kubenswrapper[4834]: I0130 21:53:48.530499 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:53:48 crc kubenswrapper[4834]: E0130 21:53:48.531132 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 
21:53:59 crc kubenswrapper[4834]: I0130 21:53:59.541442 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:53:59 crc kubenswrapper[4834]: E0130 21:53:59.542735 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:54:11 crc kubenswrapper[4834]: I0130 21:54:11.531785 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:54:11 crc kubenswrapper[4834]: E0130 21:54:11.533087 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:54:24 crc kubenswrapper[4834]: I0130 21:54:24.643311 4834 generic.go:334] "Generic (PLEG): container finished" podID="112c7019-6e0b-4366-b959-e11750f43a26" containerID="be5f1e90c26064ade6f9f0770c20e16b634114053051d04879e13619e329fbe0" exitCode=0 Jan 30 21:54:24 crc kubenswrapper[4834]: I0130 21:54:24.643942 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" event={"ID":"112c7019-6e0b-4366-b959-e11750f43a26","Type":"ContainerDied","Data":"be5f1e90c26064ade6f9f0770c20e16b634114053051d04879e13619e329fbe0"} Jan 30 21:54:25 crc kubenswrapper[4834]: I0130 21:54:25.531698 4834 
scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:54:25 crc kubenswrapper[4834]: E0130 21:54:25.532056 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.110920 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.206759 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ssh-key-openstack-edpm-ipam\") pod \"112c7019-6e0b-4366-b959-e11750f43a26\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.206840 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjkph\" (UniqueName: \"kubernetes.io/projected/112c7019-6e0b-4366-b959-e11750f43a26-kube-api-access-gjkph\") pod \"112c7019-6e0b-4366-b959-e11750f43a26\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.206931 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/112c7019-6e0b-4366-b959-e11750f43a26-ovncontroller-config-0\") pod \"112c7019-6e0b-4366-b959-e11750f43a26\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.206961 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-inventory\") pod \"112c7019-6e0b-4366-b959-e11750f43a26\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.207054 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ovn-combined-ca-bundle\") pod \"112c7019-6e0b-4366-b959-e11750f43a26\" (UID: \"112c7019-6e0b-4366-b959-e11750f43a26\") " Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.212647 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112c7019-6e0b-4366-b959-e11750f43a26-kube-api-access-gjkph" (OuterVolumeSpecName: "kube-api-access-gjkph") pod "112c7019-6e0b-4366-b959-e11750f43a26" (UID: "112c7019-6e0b-4366-b959-e11750f43a26"). InnerVolumeSpecName "kube-api-access-gjkph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.213006 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "112c7019-6e0b-4366-b959-e11750f43a26" (UID: "112c7019-6e0b-4366-b959-e11750f43a26"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.237034 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/112c7019-6e0b-4366-b959-e11750f43a26-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "112c7019-6e0b-4366-b959-e11750f43a26" (UID: "112c7019-6e0b-4366-b959-e11750f43a26"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.237037 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-inventory" (OuterVolumeSpecName: "inventory") pod "112c7019-6e0b-4366-b959-e11750f43a26" (UID: "112c7019-6e0b-4366-b959-e11750f43a26"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.241591 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "112c7019-6e0b-4366-b959-e11750f43a26" (UID: "112c7019-6e0b-4366-b959-e11750f43a26"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.310879 4834 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/112c7019-6e0b-4366-b959-e11750f43a26-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.311113 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.311125 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.311134 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/112c7019-6e0b-4366-b959-e11750f43a26-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.311142 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjkph\" (UniqueName: \"kubernetes.io/projected/112c7019-6e0b-4366-b959-e11750f43a26-kube-api-access-gjkph\") on node \"crc\" DevicePath \"\"" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.664412 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" event={"ID":"112c7019-6e0b-4366-b959-e11750f43a26","Type":"ContainerDied","Data":"86385b50db8edec2cfff67185789368e104b9a19995eacbc10b45b01aa173120"} Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.664457 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86385b50db8edec2cfff67185789368e104b9a19995eacbc10b45b01aa173120" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.664520 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hljjc" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.855666 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6"] Jan 30 21:54:26 crc kubenswrapper[4834]: E0130 21:54:26.856133 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112c7019-6e0b-4366-b959-e11750f43a26" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.856149 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="112c7019-6e0b-4366-b959-e11750f43a26" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.856340 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="112c7019-6e0b-4366-b959-e11750f43a26" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.857006 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.859314 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.859693 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.859734 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.859751 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.860326 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.861232 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:54:26 crc kubenswrapper[4834]: I0130 21:54:26.872257 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6"] Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.024204 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.024292 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.024471 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.024565 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.024621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq6d8\" (UniqueName: \"kubernetes.io/projected/83c863d4-f624-4968-92ff-c6e8bd697115-kube-api-access-pq6d8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.024713 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.127222 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.127305 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.127349 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.127472 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.127561 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.127589 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq6d8\" (UniqueName: \"kubernetes.io/projected/83c863d4-f624-4968-92ff-c6e8bd697115-kube-api-access-pq6d8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.132411 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.132614 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: 
\"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.132821 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.134679 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.135171 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.150187 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq6d8\" (UniqueName: \"kubernetes.io/projected/83c863d4-f624-4968-92ff-c6e8bd697115-kube-api-access-pq6d8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.184694 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:54:27 crc kubenswrapper[4834]: I0130 21:54:27.715285 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6"] Jan 30 21:54:28 crc kubenswrapper[4834]: I0130 21:54:28.681931 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" event={"ID":"83c863d4-f624-4968-92ff-c6e8bd697115","Type":"ContainerStarted","Data":"86f57d63ed5e003f8006da1764b487101f4850d9631966e5cac51a90645e1d6b"} Jan 30 21:54:29 crc kubenswrapper[4834]: I0130 21:54:29.690629 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" event={"ID":"83c863d4-f624-4968-92ff-c6e8bd697115","Type":"ContainerStarted","Data":"734a95c8f7e46f8b5c7d24cd4f3e142aaf1a2e79201c03cbfc8bda0f3ead1a59"} Jan 30 21:54:29 crc kubenswrapper[4834]: I0130 21:54:29.713785 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" podStartSLOduration=2.221815924 podStartE2EDuration="3.713766247s" podCreationTimestamp="2026-01-30 21:54:26 +0000 UTC" firstStartedPulling="2026-01-30 21:54:27.727544079 +0000 UTC m=+2318.880690217" lastFinishedPulling="2026-01-30 21:54:29.219494402 +0000 UTC m=+2320.372640540" observedRunningTime="2026-01-30 21:54:29.703739014 +0000 UTC m=+2320.856885152" watchObservedRunningTime="2026-01-30 21:54:29.713766247 +0000 UTC m=+2320.866912385" Jan 30 21:54:39 crc kubenswrapper[4834]: I0130 21:54:39.539812 4834 scope.go:117] "RemoveContainer" 
containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:54:39 crc kubenswrapper[4834]: E0130 21:54:39.540686 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:54:52 crc kubenswrapper[4834]: I0130 21:54:52.532110 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:54:52 crc kubenswrapper[4834]: E0130 21:54:52.532923 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:55:04 crc kubenswrapper[4834]: I0130 21:55:04.532455 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:55:04 crc kubenswrapper[4834]: E0130 21:55:04.533387 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:55:17 crc kubenswrapper[4834]: I0130 21:55:17.142657 4834 generic.go:334] 
"Generic (PLEG): container finished" podID="83c863d4-f624-4968-92ff-c6e8bd697115" containerID="734a95c8f7e46f8b5c7d24cd4f3e142aaf1a2e79201c03cbfc8bda0f3ead1a59" exitCode=0 Jan 30 21:55:17 crc kubenswrapper[4834]: I0130 21:55:17.142730 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" event={"ID":"83c863d4-f624-4968-92ff-c6e8bd697115","Type":"ContainerDied","Data":"734a95c8f7e46f8b5c7d24cd4f3e142aaf1a2e79201c03cbfc8bda0f3ead1a59"} Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.531453 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:55:18 crc kubenswrapper[4834]: E0130 21:55:18.531955 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.577358 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.646072 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq6d8\" (UniqueName: \"kubernetes.io/projected/83c863d4-f624-4968-92ff-c6e8bd697115-kube-api-access-pq6d8\") pod \"83c863d4-f624-4968-92ff-c6e8bd697115\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.646154 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-metadata-combined-ca-bundle\") pod \"83c863d4-f624-4968-92ff-c6e8bd697115\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.646212 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-ssh-key-openstack-edpm-ipam\") pod \"83c863d4-f624-4968-92ff-c6e8bd697115\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.646254 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-inventory\") pod \"83c863d4-f624-4968-92ff-c6e8bd697115\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.646527 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-ovn-metadata-agent-neutron-config-0\") pod \"83c863d4-f624-4968-92ff-c6e8bd697115\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " Jan 30 
21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.646570 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-nova-metadata-neutron-config-0\") pod \"83c863d4-f624-4968-92ff-c6e8bd697115\" (UID: \"83c863d4-f624-4968-92ff-c6e8bd697115\") " Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.670026 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "83c863d4-f624-4968-92ff-c6e8bd697115" (UID: "83c863d4-f624-4968-92ff-c6e8bd697115"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.670111 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c863d4-f624-4968-92ff-c6e8bd697115-kube-api-access-pq6d8" (OuterVolumeSpecName: "kube-api-access-pq6d8") pod "83c863d4-f624-4968-92ff-c6e8bd697115" (UID: "83c863d4-f624-4968-92ff-c6e8bd697115"). InnerVolumeSpecName "kube-api-access-pq6d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.675556 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "83c863d4-f624-4968-92ff-c6e8bd697115" (UID: "83c863d4-f624-4968-92ff-c6e8bd697115"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.679565 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "83c863d4-f624-4968-92ff-c6e8bd697115" (UID: "83c863d4-f624-4968-92ff-c6e8bd697115"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.682380 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-inventory" (OuterVolumeSpecName: "inventory") pod "83c863d4-f624-4968-92ff-c6e8bd697115" (UID: "83c863d4-f624-4968-92ff-c6e8bd697115"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.701698 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "83c863d4-f624-4968-92ff-c6e8bd697115" (UID: "83c863d4-f624-4968-92ff-c6e8bd697115"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.749993 4834 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.750031 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.750044 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq6d8\" (UniqueName: \"kubernetes.io/projected/83c863d4-f624-4968-92ff-c6e8bd697115-kube-api-access-pq6d8\") on node \"crc\" DevicePath \"\"" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.750057 4834 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.750070 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:55:18 crc kubenswrapper[4834]: I0130 21:55:18.750084 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/83c863d4-f624-4968-92ff-c6e8bd697115-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.161543 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" event={"ID":"83c863d4-f624-4968-92ff-c6e8bd697115","Type":"ContainerDied","Data":"86f57d63ed5e003f8006da1764b487101f4850d9631966e5cac51a90645e1d6b"} Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.161593 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86f57d63ed5e003f8006da1764b487101f4850d9631966e5cac51a90645e1d6b" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.161954 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.268897 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl"] Jan 30 21:55:19 crc kubenswrapper[4834]: E0130 21:55:19.269324 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c863d4-f624-4968-92ff-c6e8bd697115" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.269342 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c863d4-f624-4968-92ff-c6e8bd697115" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.269566 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c863d4-f624-4968-92ff-c6e8bd697115" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.270302 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.271988 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.272747 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.273066 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.273220 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.273264 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.279652 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl"] Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.363581 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.363688 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87js5\" (UniqueName: \"kubernetes.io/projected/0ca23ae2-7ce2-414a-8d68-41008397be4a-kube-api-access-87js5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.363999 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.364139 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.364220 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.466449 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.466533 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.466571 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.466612 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.466670 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87js5\" (UniqueName: \"kubernetes.io/projected/0ca23ae2-7ce2-414a-8d68-41008397be4a-kube-api-access-87js5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.471865 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: 
\"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.472697 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.480632 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.488905 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.490318 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87js5\" (UniqueName: \"kubernetes.io/projected/0ca23ae2-7ce2-414a-8d68-41008397be4a-kube-api-access-87js5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:19 crc kubenswrapper[4834]: I0130 21:55:19.645090 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:55:20 crc kubenswrapper[4834]: I0130 21:55:20.159697 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl"] Jan 30 21:55:20 crc kubenswrapper[4834]: I0130 21:55:20.174698 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" event={"ID":"0ca23ae2-7ce2-414a-8d68-41008397be4a","Type":"ContainerStarted","Data":"6b18e3a53248ead8d36fec86430034d2905be47ffaf384255852938a1b18840f"} Jan 30 21:55:21 crc kubenswrapper[4834]: I0130 21:55:21.186751 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" event={"ID":"0ca23ae2-7ce2-414a-8d68-41008397be4a","Type":"ContainerStarted","Data":"d4c423e77b5b5c562a8074efcd2a4c530d1cb60b5634790cbbaedfd859a1ca4a"} Jan 30 21:55:21 crc kubenswrapper[4834]: I0130 21:55:21.208924 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" podStartSLOduration=1.5491576710000001 podStartE2EDuration="2.208906829s" podCreationTimestamp="2026-01-30 21:55:19 +0000 UTC" firstStartedPulling="2026-01-30 21:55:20.162129614 +0000 UTC m=+2371.315275752" lastFinishedPulling="2026-01-30 21:55:20.821878752 +0000 UTC m=+2371.975024910" observedRunningTime="2026-01-30 21:55:21.201320055 +0000 UTC m=+2372.354466193" watchObservedRunningTime="2026-01-30 21:55:21.208906829 +0000 UTC m=+2372.362052967" Jan 30 21:55:32 crc kubenswrapper[4834]: I0130 21:55:32.531123 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:55:32 crc kubenswrapper[4834]: E0130 21:55:32.532165 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:55:45 crc kubenswrapper[4834]: I0130 21:55:45.531722 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:55:45 crc kubenswrapper[4834]: E0130 21:55:45.532582 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:55:53 crc kubenswrapper[4834]: I0130 21:55:53.875200 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vdz8q"] Jan 30 21:55:53 crc kubenswrapper[4834]: I0130 21:55:53.880525 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:53 crc kubenswrapper[4834]: I0130 21:55:53.893703 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vdz8q"] Jan 30 21:55:53 crc kubenswrapper[4834]: I0130 21:55:53.906123 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-catalog-content\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:53 crc kubenswrapper[4834]: I0130 21:55:53.906179 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnwxl\" (UniqueName: \"kubernetes.io/projected/7581f595-2b65-4322-a664-726a32f00617-kube-api-access-pnwxl\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:53 crc kubenswrapper[4834]: I0130 21:55:53.906296 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-utilities\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:54 crc kubenswrapper[4834]: I0130 21:55:54.007525 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-catalog-content\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:54 crc kubenswrapper[4834]: I0130 21:55:54.007572 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pnwxl\" (UniqueName: \"kubernetes.io/projected/7581f595-2b65-4322-a664-726a32f00617-kube-api-access-pnwxl\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:54 crc kubenswrapper[4834]: I0130 21:55:54.007640 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-utilities\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:54 crc kubenswrapper[4834]: I0130 21:55:54.008004 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-catalog-content\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:54 crc kubenswrapper[4834]: I0130 21:55:54.008049 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-utilities\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:54 crc kubenswrapper[4834]: I0130 21:55:54.060656 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnwxl\" (UniqueName: \"kubernetes.io/projected/7581f595-2b65-4322-a664-726a32f00617-kube-api-access-pnwxl\") pod \"community-operators-vdz8q\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:54 crc kubenswrapper[4834]: I0130 21:55:54.201010 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:55:54 crc kubenswrapper[4834]: I0130 21:55:54.777179 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vdz8q"] Jan 30 21:55:55 crc kubenswrapper[4834]: I0130 21:55:55.518708 4834 generic.go:334] "Generic (PLEG): container finished" podID="7581f595-2b65-4322-a664-726a32f00617" containerID="94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296" exitCode=0 Jan 30 21:55:55 crc kubenswrapper[4834]: I0130 21:55:55.518800 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdz8q" event={"ID":"7581f595-2b65-4322-a664-726a32f00617","Type":"ContainerDied","Data":"94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296"} Jan 30 21:55:55 crc kubenswrapper[4834]: I0130 21:55:55.518982 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdz8q" event={"ID":"7581f595-2b65-4322-a664-726a32f00617","Type":"ContainerStarted","Data":"a6974918a76cb91abfa715a6f602c12b823a2b095c14c0420c3aef879e9c4835"} Jan 30 21:55:55 crc kubenswrapper[4834]: I0130 21:55:55.522930 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 21:55:57 crc kubenswrapper[4834]: I0130 21:55:57.544981 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdz8q" event={"ID":"7581f595-2b65-4322-a664-726a32f00617","Type":"ContainerStarted","Data":"4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236"} Jan 30 21:55:58 crc kubenswrapper[4834]: I0130 21:55:58.532144 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:55:58 crc kubenswrapper[4834]: E0130 21:55:58.532888 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:55:58 crc kubenswrapper[4834]: I0130 21:55:58.555462 4834 generic.go:334] "Generic (PLEG): container finished" podID="7581f595-2b65-4322-a664-726a32f00617" containerID="4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236" exitCode=0 Jan 30 21:55:58 crc kubenswrapper[4834]: I0130 21:55:58.555507 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdz8q" event={"ID":"7581f595-2b65-4322-a664-726a32f00617","Type":"ContainerDied","Data":"4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236"} Jan 30 21:55:59 crc kubenswrapper[4834]: I0130 21:55:59.567588 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdz8q" event={"ID":"7581f595-2b65-4322-a664-726a32f00617","Type":"ContainerStarted","Data":"f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2"} Jan 30 21:55:59 crc kubenswrapper[4834]: I0130 21:55:59.601161 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vdz8q" podStartSLOduration=3.043970121 podStartE2EDuration="6.601134383s" podCreationTimestamp="2026-01-30 21:55:53 +0000 UTC" firstStartedPulling="2026-01-30 21:55:55.522554039 +0000 UTC m=+2406.675700177" lastFinishedPulling="2026-01-30 21:55:59.079718301 +0000 UTC m=+2410.232864439" observedRunningTime="2026-01-30 21:55:59.589942857 +0000 UTC m=+2410.743088995" watchObservedRunningTime="2026-01-30 21:55:59.601134383 +0000 UTC m=+2410.754280521" Jan 30 21:56:04 crc kubenswrapper[4834]: I0130 21:56:04.201660 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:56:04 crc kubenswrapper[4834]: I0130 21:56:04.203913 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:56:04 crc kubenswrapper[4834]: I0130 21:56:04.247961 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:56:04 crc kubenswrapper[4834]: I0130 21:56:04.661304 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:56:04 crc kubenswrapper[4834]: I0130 21:56:04.711878 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vdz8q"] Jan 30 21:56:06 crc kubenswrapper[4834]: I0130 21:56:06.635911 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vdz8q" podUID="7581f595-2b65-4322-a664-726a32f00617" containerName="registry-server" containerID="cri-o://f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2" gracePeriod=2 Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.158364 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.270380 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-utilities\") pod \"7581f595-2b65-4322-a664-726a32f00617\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.270479 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-catalog-content\") pod \"7581f595-2b65-4322-a664-726a32f00617\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.270545 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnwxl\" (UniqueName: \"kubernetes.io/projected/7581f595-2b65-4322-a664-726a32f00617-kube-api-access-pnwxl\") pod \"7581f595-2b65-4322-a664-726a32f00617\" (UID: \"7581f595-2b65-4322-a664-726a32f00617\") " Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.271567 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-utilities" (OuterVolumeSpecName: "utilities") pod "7581f595-2b65-4322-a664-726a32f00617" (UID: "7581f595-2b65-4322-a664-726a32f00617"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.277035 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7581f595-2b65-4322-a664-726a32f00617-kube-api-access-pnwxl" (OuterVolumeSpecName: "kube-api-access-pnwxl") pod "7581f595-2b65-4322-a664-726a32f00617" (UID: "7581f595-2b65-4322-a664-726a32f00617"). InnerVolumeSpecName "kube-api-access-pnwxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.373517 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnwxl\" (UniqueName: \"kubernetes.io/projected/7581f595-2b65-4322-a664-726a32f00617-kube-api-access-pnwxl\") on node \"crc\" DevicePath \"\"" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.373566 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.646224 4834 generic.go:334] "Generic (PLEG): container finished" podID="7581f595-2b65-4322-a664-726a32f00617" containerID="f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2" exitCode=0 Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.646269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdz8q" event={"ID":"7581f595-2b65-4322-a664-726a32f00617","Type":"ContainerDied","Data":"f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2"} Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.646301 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vdz8q" event={"ID":"7581f595-2b65-4322-a664-726a32f00617","Type":"ContainerDied","Data":"a6974918a76cb91abfa715a6f602c12b823a2b095c14c0420c3aef879e9c4835"} Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.646318 4834 scope.go:117] "RemoveContainer" containerID="f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.647236 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vdz8q" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.663958 4834 scope.go:117] "RemoveContainer" containerID="4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.694029 4834 scope.go:117] "RemoveContainer" containerID="94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.738298 4834 scope.go:117] "RemoveContainer" containerID="f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2" Jan 30 21:56:07 crc kubenswrapper[4834]: E0130 21:56:07.738847 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2\": container with ID starting with f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2 not found: ID does not exist" containerID="f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.738887 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2"} err="failed to get container status \"f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2\": rpc error: code = NotFound desc = could not find container \"f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2\": container with ID starting with f853e8b95be22e44076a347fccd4f1acfc6878c8cf4645116039d750ad77bfa2 not found: ID does not exist" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.738914 4834 scope.go:117] "RemoveContainer" containerID="4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236" Jan 30 21:56:07 crc kubenswrapper[4834]: E0130 21:56:07.739325 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236\": container with ID starting with 4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236 not found: ID does not exist" containerID="4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.739350 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236"} err="failed to get container status \"4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236\": rpc error: code = NotFound desc = could not find container \"4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236\": container with ID starting with 4c0a68146fc5f0efd211aa386f7d64e031bbc15ccd46e15a50f2aa506d1ad236 not found: ID does not exist" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.739365 4834 scope.go:117] "RemoveContainer" containerID="94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296" Jan 30 21:56:07 crc kubenswrapper[4834]: E0130 21:56:07.739862 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296\": container with ID starting with 94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296 not found: ID does not exist" containerID="94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296" Jan 30 21:56:07 crc kubenswrapper[4834]: I0130 21:56:07.739898 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296"} err="failed to get container status \"94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296\": rpc error: code = NotFound desc = could not find container 
\"94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296\": container with ID starting with 94bf878e3441719e2f63fa910f676b3ed1da8592a141feaa543217d01c666296 not found: ID does not exist" Jan 30 21:56:08 crc kubenswrapper[4834]: I0130 21:56:08.071865 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7581f595-2b65-4322-a664-726a32f00617" (UID: "7581f595-2b65-4322-a664-726a32f00617"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:56:08 crc kubenswrapper[4834]: I0130 21:56:08.090377 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7581f595-2b65-4322-a664-726a32f00617-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:56:08 crc kubenswrapper[4834]: I0130 21:56:08.284359 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vdz8q"] Jan 30 21:56:08 crc kubenswrapper[4834]: I0130 21:56:08.292488 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vdz8q"] Jan 30 21:56:09 crc kubenswrapper[4834]: I0130 21:56:09.548941 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7581f595-2b65-4322-a664-726a32f00617" path="/var/lib/kubelet/pods/7581f595-2b65-4322-a664-726a32f00617/volumes" Jan 30 21:56:12 crc kubenswrapper[4834]: I0130 21:56:12.531060 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:56:12 crc kubenswrapper[4834]: E0130 21:56:12.531527 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:56:24 crc kubenswrapper[4834]: I0130 21:56:24.531576 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:56:24 crc kubenswrapper[4834]: E0130 21:56:24.532186 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:56:37 crc kubenswrapper[4834]: I0130 21:56:37.530588 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:56:37 crc kubenswrapper[4834]: E0130 21:56:37.531554 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.755301 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hj8nv"] Jan 30 21:56:41 crc kubenswrapper[4834]: E0130 21:56:41.756289 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7581f595-2b65-4322-a664-726a32f00617" containerName="registry-server" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 
21:56:41.756305 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7581f595-2b65-4322-a664-726a32f00617" containerName="registry-server" Jan 30 21:56:41 crc kubenswrapper[4834]: E0130 21:56:41.756323 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7581f595-2b65-4322-a664-726a32f00617" containerName="extract-utilities" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.756332 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7581f595-2b65-4322-a664-726a32f00617" containerName="extract-utilities" Jan 30 21:56:41 crc kubenswrapper[4834]: E0130 21:56:41.756382 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7581f595-2b65-4322-a664-726a32f00617" containerName="extract-content" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.756391 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7581f595-2b65-4322-a664-726a32f00617" containerName="extract-content" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.756666 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7581f595-2b65-4322-a664-726a32f00617" containerName="registry-server" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.758447 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.773763 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj8nv"] Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.812038 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8xfs\" (UniqueName: \"kubernetes.io/projected/4b30002f-3437-4c4e-b7a0-5774efbe5072-kube-api-access-w8xfs\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.812182 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-catalog-content\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.812230 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-utilities\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.913926 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-catalog-content\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.914243 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-utilities\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.914348 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8xfs\" (UniqueName: \"kubernetes.io/projected/4b30002f-3437-4c4e-b7a0-5774efbe5072-kube-api-access-w8xfs\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.914434 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-catalog-content\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.914733 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-utilities\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:41 crc kubenswrapper[4834]: I0130 21:56:41.935742 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8xfs\" (UniqueName: \"kubernetes.io/projected/4b30002f-3437-4c4e-b7a0-5774efbe5072-kube-api-access-w8xfs\") pod \"redhat-marketplace-hj8nv\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:42 crc kubenswrapper[4834]: I0130 21:56:42.079941 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:42 crc kubenswrapper[4834]: I0130 21:56:42.572079 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj8nv"] Jan 30 21:56:42 crc kubenswrapper[4834]: I0130 21:56:42.971199 4834 generic.go:334] "Generic (PLEG): container finished" podID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerID="814e4641c7e44cdec29bef1a1e0f59d9c0c912f99715b04aa6b1d4e2dbd674b5" exitCode=0 Jan 30 21:56:42 crc kubenswrapper[4834]: I0130 21:56:42.971241 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj8nv" event={"ID":"4b30002f-3437-4c4e-b7a0-5774efbe5072","Type":"ContainerDied","Data":"814e4641c7e44cdec29bef1a1e0f59d9c0c912f99715b04aa6b1d4e2dbd674b5"} Jan 30 21:56:42 crc kubenswrapper[4834]: I0130 21:56:42.972523 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj8nv" event={"ID":"4b30002f-3437-4c4e-b7a0-5774efbe5072","Type":"ContainerStarted","Data":"8fea86474587d0576c6ca01bc27516bc501d7bbc85913e080a4ddb9e20616285"} Jan 30 21:56:44 crc kubenswrapper[4834]: I0130 21:56:44.990618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj8nv" event={"ID":"4b30002f-3437-4c4e-b7a0-5774efbe5072","Type":"ContainerStarted","Data":"a2cf3f193b2af2cb1d3ccf1d5b6225ff9cc3cbca2f7dc8d143010e6815d4579e"} Jan 30 21:56:46 crc kubenswrapper[4834]: I0130 21:56:46.003324 4834 generic.go:334] "Generic (PLEG): container finished" podID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerID="a2cf3f193b2af2cb1d3ccf1d5b6225ff9cc3cbca2f7dc8d143010e6815d4579e" exitCode=0 Jan 30 21:56:46 crc kubenswrapper[4834]: I0130 21:56:46.003410 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj8nv" 
event={"ID":"4b30002f-3437-4c4e-b7a0-5774efbe5072","Type":"ContainerDied","Data":"a2cf3f193b2af2cb1d3ccf1d5b6225ff9cc3cbca2f7dc8d143010e6815d4579e"} Jan 30 21:56:47 crc kubenswrapper[4834]: I0130 21:56:47.014341 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj8nv" event={"ID":"4b30002f-3437-4c4e-b7a0-5774efbe5072","Type":"ContainerStarted","Data":"b1babb52543fe618601978565f277ae9bdc7315ae4f439bd27ab08932f36286b"} Jan 30 21:56:47 crc kubenswrapper[4834]: I0130 21:56:47.041349 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hj8nv" podStartSLOduration=2.5399478589999998 podStartE2EDuration="6.041323026s" podCreationTimestamp="2026-01-30 21:56:41 +0000 UTC" firstStartedPulling="2026-01-30 21:56:42.972914389 +0000 UTC m=+2454.126060527" lastFinishedPulling="2026-01-30 21:56:46.474289546 +0000 UTC m=+2457.627435694" observedRunningTime="2026-01-30 21:56:47.03011914 +0000 UTC m=+2458.183265288" watchObservedRunningTime="2026-01-30 21:56:47.041323026 +0000 UTC m=+2458.194469174" Jan 30 21:56:51 crc kubenswrapper[4834]: I0130 21:56:51.531902 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:56:51 crc kubenswrapper[4834]: E0130 21:56:51.532787 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:56:52 crc kubenswrapper[4834]: I0130 21:56:52.081539 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:52 crc 
kubenswrapper[4834]: I0130 21:56:52.081893 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:52 crc kubenswrapper[4834]: I0130 21:56:52.162204 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:53 crc kubenswrapper[4834]: I0130 21:56:53.217627 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:53 crc kubenswrapper[4834]: I0130 21:56:53.283061 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj8nv"] Jan 30 21:56:55 crc kubenswrapper[4834]: I0130 21:56:55.099916 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hj8nv" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerName="registry-server" containerID="cri-o://b1babb52543fe618601978565f277ae9bdc7315ae4f439bd27ab08932f36286b" gracePeriod=2 Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.114041 4834 generic.go:334] "Generic (PLEG): container finished" podID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerID="b1babb52543fe618601978565f277ae9bdc7315ae4f439bd27ab08932f36286b" exitCode=0 Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.114110 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj8nv" event={"ID":"4b30002f-3437-4c4e-b7a0-5774efbe5072","Type":"ContainerDied","Data":"b1babb52543fe618601978565f277ae9bdc7315ae4f439bd27ab08932f36286b"} Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.114543 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hj8nv" event={"ID":"4b30002f-3437-4c4e-b7a0-5774efbe5072","Type":"ContainerDied","Data":"8fea86474587d0576c6ca01bc27516bc501d7bbc85913e080a4ddb9e20616285"} Jan 30 21:56:56 
crc kubenswrapper[4834]: I0130 21:56:56.114558 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fea86474587d0576c6ca01bc27516bc501d7bbc85913e080a4ddb9e20616285" Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.133236 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.259261 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8xfs\" (UniqueName: \"kubernetes.io/projected/4b30002f-3437-4c4e-b7a0-5774efbe5072-kube-api-access-w8xfs\") pod \"4b30002f-3437-4c4e-b7a0-5774efbe5072\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.259469 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-utilities\") pod \"4b30002f-3437-4c4e-b7a0-5774efbe5072\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.259518 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-catalog-content\") pod \"4b30002f-3437-4c4e-b7a0-5774efbe5072\" (UID: \"4b30002f-3437-4c4e-b7a0-5774efbe5072\") " Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.260766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-utilities" (OuterVolumeSpecName: "utilities") pod "4b30002f-3437-4c4e-b7a0-5774efbe5072" (UID: "4b30002f-3437-4c4e-b7a0-5774efbe5072"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.265312 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b30002f-3437-4c4e-b7a0-5774efbe5072-kube-api-access-w8xfs" (OuterVolumeSpecName: "kube-api-access-w8xfs") pod "4b30002f-3437-4c4e-b7a0-5774efbe5072" (UID: "4b30002f-3437-4c4e-b7a0-5774efbe5072"). InnerVolumeSpecName "kube-api-access-w8xfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.362211 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.362269 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8xfs\" (UniqueName: \"kubernetes.io/projected/4b30002f-3437-4c4e-b7a0-5774efbe5072-kube-api-access-w8xfs\") on node \"crc\" DevicePath \"\"" Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.853223 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b30002f-3437-4c4e-b7a0-5774efbe5072" (UID: "4b30002f-3437-4c4e-b7a0-5774efbe5072"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 21:56:56 crc kubenswrapper[4834]: I0130 21:56:56.873583 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b30002f-3437-4c4e-b7a0-5774efbe5072-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 21:56:57 crc kubenswrapper[4834]: I0130 21:56:57.125893 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hj8nv" Jan 30 21:56:57 crc kubenswrapper[4834]: I0130 21:56:57.166941 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj8nv"] Jan 30 21:56:57 crc kubenswrapper[4834]: I0130 21:56:57.179774 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hj8nv"] Jan 30 21:56:57 crc kubenswrapper[4834]: I0130 21:56:57.543657 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" path="/var/lib/kubelet/pods/4b30002f-3437-4c4e-b7a0-5774efbe5072/volumes" Jan 30 21:57:06 crc kubenswrapper[4834]: I0130 21:57:06.530573 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:57:06 crc kubenswrapper[4834]: E0130 21:57:06.531242 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:57:19 crc kubenswrapper[4834]: I0130 21:57:19.544695 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:57:19 crc kubenswrapper[4834]: E0130 21:57:19.545782 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" 
podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:57:31 crc kubenswrapper[4834]: I0130 21:57:31.532190 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:57:31 crc kubenswrapper[4834]: E0130 21:57:31.533003 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:57:44 crc kubenswrapper[4834]: I0130 21:57:44.531345 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:57:44 crc kubenswrapper[4834]: E0130 21:57:44.532090 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:57:59 crc kubenswrapper[4834]: I0130 21:57:59.542261 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:57:59 crc kubenswrapper[4834]: E0130 21:57:59.543531 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:58:13 crc kubenswrapper[4834]: I0130 21:58:13.532563 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:58:13 crc kubenswrapper[4834]: E0130 21:58:13.533186 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:58:25 crc kubenswrapper[4834]: I0130 21:58:25.533514 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:58:25 crc kubenswrapper[4834]: E0130 21:58:25.534295 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 21:58:39 crc kubenswrapper[4834]: I0130 21:58:39.538317 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 21:58:40 crc kubenswrapper[4834]: I0130 21:58:40.610631 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"1bf310a9d0e5abd60888ccf207fce82c1cbc19301cf50c7335e1336ee9e1f0bd"} Jan 30 21:59:20 crc 
kubenswrapper[4834]: I0130 21:59:20.696899 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ca23ae2-7ce2-414a-8d68-41008397be4a" containerID="d4c423e77b5b5c562a8074efcd2a4c530d1cb60b5634790cbbaedfd859a1ca4a" exitCode=0 Jan 30 21:59:20 crc kubenswrapper[4834]: I0130 21:59:20.697034 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" event={"ID":"0ca23ae2-7ce2-414a-8d68-41008397be4a","Type":"ContainerDied","Data":"d4c423e77b5b5c562a8074efcd2a4c530d1cb60b5634790cbbaedfd859a1ca4a"} Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.210127 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.351755 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87js5\" (UniqueName: \"kubernetes.io/projected/0ca23ae2-7ce2-414a-8d68-41008397be4a-kube-api-access-87js5\") pod \"0ca23ae2-7ce2-414a-8d68-41008397be4a\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.351816 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-secret-0\") pod \"0ca23ae2-7ce2-414a-8d68-41008397be4a\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.351851 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-inventory\") pod \"0ca23ae2-7ce2-414a-8d68-41008397be4a\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.351904 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-ssh-key-openstack-edpm-ipam\") pod \"0ca23ae2-7ce2-414a-8d68-41008397be4a\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.351961 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-combined-ca-bundle\") pod \"0ca23ae2-7ce2-414a-8d68-41008397be4a\" (UID: \"0ca23ae2-7ce2-414a-8d68-41008397be4a\") " Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.358094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0ca23ae2-7ce2-414a-8d68-41008397be4a" (UID: "0ca23ae2-7ce2-414a-8d68-41008397be4a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.358563 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca23ae2-7ce2-414a-8d68-41008397be4a-kube-api-access-87js5" (OuterVolumeSpecName: "kube-api-access-87js5") pod "0ca23ae2-7ce2-414a-8d68-41008397be4a" (UID: "0ca23ae2-7ce2-414a-8d68-41008397be4a"). InnerVolumeSpecName "kube-api-access-87js5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.381522 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0ca23ae2-7ce2-414a-8d68-41008397be4a" (UID: "0ca23ae2-7ce2-414a-8d68-41008397be4a"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.383261 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0ca23ae2-7ce2-414a-8d68-41008397be4a" (UID: "0ca23ae2-7ce2-414a-8d68-41008397be4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.392288 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-inventory" (OuterVolumeSpecName: "inventory") pod "0ca23ae2-7ce2-414a-8d68-41008397be4a" (UID: "0ca23ae2-7ce2-414a-8d68-41008397be4a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.454607 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87js5\" (UniqueName: \"kubernetes.io/projected/0ca23ae2-7ce2-414a-8d68-41008397be4a-kube-api-access-87js5\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.454647 4834 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.454661 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.454673 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.454685 4834 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca23ae2-7ce2-414a-8d68-41008397be4a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.718230 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" event={"ID":"0ca23ae2-7ce2-414a-8d68-41008397be4a","Type":"ContainerDied","Data":"6b18e3a53248ead8d36fec86430034d2905be47ffaf384255852938a1b18840f"} Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.718283 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b18e3a53248ead8d36fec86430034d2905be47ffaf384255852938a1b18840f" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.718283 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.853301 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"] Jan 30 21:59:22 crc kubenswrapper[4834]: E0130 21:59:22.853727 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerName="extract-content" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.853747 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerName="extract-content" Jan 30 21:59:22 crc kubenswrapper[4834]: E0130 21:59:22.853767 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerName="extract-utilities" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.853774 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerName="extract-utilities" Jan 30 21:59:22 crc kubenswrapper[4834]: E0130 21:59:22.853791 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerName="registry-server" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.853798 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerName="registry-server" Jan 30 21:59:22 crc kubenswrapper[4834]: E0130 21:59:22.853813 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca23ae2-7ce2-414a-8d68-41008397be4a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.853820 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca23ae2-7ce2-414a-8d68-41008397be4a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.853984 4834 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca23ae2-7ce2-414a-8d68-41008397be4a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.854008 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b30002f-3437-4c4e-b7a0-5774efbe5072" containerName="registry-server"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.854675 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.856745 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.857158 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.857336 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.857456 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.857529 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.857802 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.858627 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.872712 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"]
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.965577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.965967 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.965998 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7dlz\" (UniqueName: \"kubernetes.io/projected/5f06d94f-e9f1-42b2-9978-b7a128728a60-kube-api-access-j7dlz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.966025 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.966563 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.966657 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.966895 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.967011 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:22 crc kubenswrapper[4834]: I0130 21:59:22.967065 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069019 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069093 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069118 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7dlz\" (UniqueName: \"kubernetes.io/projected/5f06d94f-e9f1-42b2-9978-b7a128728a60-kube-api-access-j7dlz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069146 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069218 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069234 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069270 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069297 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.069319 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.070168 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.074813 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.074905 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.075258 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.075299 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.076557 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.076567 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.080205 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.096944 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7dlz\" (UniqueName: \"kubernetes.io/projected/5f06d94f-e9f1-42b2-9978-b7a128728a60-kube-api-access-j7dlz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-2xqvc\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.194121 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"
Jan 30 21:59:23 crc kubenswrapper[4834]: I0130 21:59:23.737247 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc"]
Jan 30 21:59:24 crc kubenswrapper[4834]: I0130 21:59:24.740078 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc" event={"ID":"5f06d94f-e9f1-42b2-9978-b7a128728a60","Type":"ContainerStarted","Data":"30ef7d8d400b58294b73602b449c10e76d133e8bcf23e19f059b6565560d3cd1"}
Jan 30 21:59:24 crc kubenswrapper[4834]: I0130 21:59:24.740423 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc" event={"ID":"5f06d94f-e9f1-42b2-9978-b7a128728a60","Type":"ContainerStarted","Data":"8a178ca2da1beaa831893e5c55174d0633c595c6dd01311aaf3ebb0eaca543cc"}
Jan 30 21:59:24 crc kubenswrapper[4834]: I0130 21:59:24.768664 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc" podStartSLOduration=2.3768237770000002 podStartE2EDuration="2.768637966s" podCreationTimestamp="2026-01-30 21:59:22 +0000 UTC" firstStartedPulling="2026-01-30 21:59:23.732327311 +0000 UTC m=+2614.885473459" lastFinishedPulling="2026-01-30 21:59:24.1241415 +0000 UTC m=+2615.277287648" observedRunningTime="2026-01-30 21:59:24.763552903 +0000 UTC m=+2615.916699061" watchObservedRunningTime="2026-01-30 21:59:24.768637966 +0000 UTC m=+2615.921784124"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.155870 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"]
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.158127 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.161190 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.162622 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.166384 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"]
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.186667 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2wr\" (UniqueName: \"kubernetes.io/projected/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-kube-api-access-rp2wr\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.186747 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-config-volume\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.186877 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-secret-volume\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.288876 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2wr\" (UniqueName: \"kubernetes.io/projected/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-kube-api-access-rp2wr\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.288940 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-config-volume\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.288994 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-secret-volume\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.289858 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-config-volume\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.294743 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-secret-volume\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.312099 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2wr\" (UniqueName: \"kubernetes.io/projected/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-kube-api-access-rp2wr\") pod \"collect-profiles-29496840-5gdzj\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.486388 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:00 crc kubenswrapper[4834]: I0130 22:00:00.938811 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"]
Jan 30 22:00:01 crc kubenswrapper[4834]: I0130 22:00:01.102151 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj" event={"ID":"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd","Type":"ContainerStarted","Data":"7410439a7778b0bc869209a11dcc7db31158cdf6badc1cdcfab537f538736205"}
Jan 30 22:00:01 crc kubenswrapper[4834]: I0130 22:00:01.102199 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj" event={"ID":"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd","Type":"ContainerStarted","Data":"b142502c7d1578ce6d8d1a52f29c933951341903a65c8cf6943b6ef584cfede1"}
Jan 30 22:00:01 crc kubenswrapper[4834]: I0130 22:00:01.125725 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj" podStartSLOduration=1.125708709 podStartE2EDuration="1.125708709s" podCreationTimestamp="2026-01-30 22:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 22:00:01.123493707 +0000 UTC m=+2652.276639865" watchObservedRunningTime="2026-01-30 22:00:01.125708709 +0000 UTC m=+2652.278854847"
Jan 30 22:00:02 crc kubenswrapper[4834]: I0130 22:00:02.123805 4834 generic.go:334] "Generic (PLEG): container finished" podID="20fa6fa7-60f8-4ade-96e6-b0e3f77340bd" containerID="7410439a7778b0bc869209a11dcc7db31158cdf6badc1cdcfab537f538736205" exitCode=0
Jan 30 22:00:02 crc kubenswrapper[4834]: I0130 22:00:02.123881 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj" event={"ID":"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd","Type":"ContainerDied","Data":"7410439a7778b0bc869209a11dcc7db31158cdf6badc1cdcfab537f538736205"}
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.662902 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.761114 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-config-volume\") pod \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") "
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.761234 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp2wr\" (UniqueName: \"kubernetes.io/projected/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-kube-api-access-rp2wr\") pod \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") "
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.761390 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-secret-volume\") pod \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\" (UID: \"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd\") "
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.761893 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-config-volume" (OuterVolumeSpecName: "config-volume") pod "20fa6fa7-60f8-4ade-96e6-b0e3f77340bd" (UID: "20fa6fa7-60f8-4ade-96e6-b0e3f77340bd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.766662 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "20fa6fa7-60f8-4ade-96e6-b0e3f77340bd" (UID: "20fa6fa7-60f8-4ade-96e6-b0e3f77340bd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.771727 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-kube-api-access-rp2wr" (OuterVolumeSpecName: "kube-api-access-rp2wr") pod "20fa6fa7-60f8-4ade-96e6-b0e3f77340bd" (UID: "20fa6fa7-60f8-4ade-96e6-b0e3f77340bd"). InnerVolumeSpecName "kube-api-access-rp2wr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.863879 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-config-volume\") on node \"crc\" DevicePath \"\""
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.864178 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp2wr\" (UniqueName: \"kubernetes.io/projected/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-kube-api-access-rp2wr\") on node \"crc\" DevicePath \"\""
Jan 30 22:00:03 crc kubenswrapper[4834]: I0130 22:00:03.864254 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20fa6fa7-60f8-4ade-96e6-b0e3f77340bd-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 30 22:00:04 crc kubenswrapper[4834]: I0130 22:00:04.140909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj" event={"ID":"20fa6fa7-60f8-4ade-96e6-b0e3f77340bd","Type":"ContainerDied","Data":"b142502c7d1578ce6d8d1a52f29c933951341903a65c8cf6943b6ef584cfede1"}
Jan 30 22:00:04 crc kubenswrapper[4834]: I0130 22:00:04.140956 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b142502c7d1578ce6d8d1a52f29c933951341903a65c8cf6943b6ef584cfede1"
Jan 30 22:00:04 crc kubenswrapper[4834]: I0130 22:00:04.141302 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496840-5gdzj"
Jan 30 22:00:04 crc kubenswrapper[4834]: I0130 22:00:04.739418 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2"]
Jan 30 22:00:04 crc kubenswrapper[4834]: I0130 22:00:04.747631 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496795-rhdv2"]
Jan 30 22:00:05 crc kubenswrapper[4834]: I0130 22:00:05.547484 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3246e84-3def-488f-8a8f-069bdc3fa563" path="/var/lib/kubelet/pods/f3246e84-3def-488f-8a8f-069bdc3fa563/volumes"
Jan 30 22:00:06 crc kubenswrapper[4834]: I0130 22:00:06.823359 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k78z8"]
Jan 30 22:00:06 crc kubenswrapper[4834]: E0130 22:00:06.823924 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20fa6fa7-60f8-4ade-96e6-b0e3f77340bd" containerName="collect-profiles"
Jan 30 22:00:06 crc kubenswrapper[4834]: I0130 22:00:06.823940 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="20fa6fa7-60f8-4ade-96e6-b0e3f77340bd" containerName="collect-profiles"
Jan 30 22:00:06 crc kubenswrapper[4834]: I0130 22:00:06.824246 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="20fa6fa7-60f8-4ade-96e6-b0e3f77340bd" containerName="collect-profiles"
Jan 30 22:00:06 crc kubenswrapper[4834]: I0130 22:00:06.826098 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:06 crc kubenswrapper[4834]: I0130 22:00:06.842130 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k78z8"]
Jan 30 22:00:06 crc kubenswrapper[4834]: I0130 22:00:06.919372 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gwb\" (UniqueName: \"kubernetes.io/projected/b78a0559-d247-4005-9c8f-914106ee39d8-kube-api-access-27gwb\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:06 crc kubenswrapper[4834]: I0130 22:00:06.919592 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-catalog-content\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:06 crc kubenswrapper[4834]: I0130 22:00:06.919865 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-utilities\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:07 crc kubenswrapper[4834]: I0130 22:00:07.022451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gwb\" (UniqueName: \"kubernetes.io/projected/b78a0559-d247-4005-9c8f-914106ee39d8-kube-api-access-27gwb\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:07 crc kubenswrapper[4834]: I0130 22:00:07.022878 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-catalog-content\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:07 crc kubenswrapper[4834]: I0130 22:00:07.023096 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-utilities\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:07 crc kubenswrapper[4834]: I0130 22:00:07.023374 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-catalog-content\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:07 crc kubenswrapper[4834]: I0130 22:00:07.023597 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-utilities\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:07 crc kubenswrapper[4834]: I0130 22:00:07.048985 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gwb\" (UniqueName: \"kubernetes.io/projected/b78a0559-d247-4005-9c8f-914106ee39d8-kube-api-access-27gwb\") pod \"certified-operators-k78z8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") " pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:07 crc kubenswrapper[4834]: I0130 22:00:07.154885 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:07 crc kubenswrapper[4834]: I0130 22:00:07.666574 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k78z8"]
Jan 30 22:00:08 crc kubenswrapper[4834]: I0130 22:00:08.179358 4834 generic.go:334] "Generic (PLEG): container finished" podID="b78a0559-d247-4005-9c8f-914106ee39d8" containerID="2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885" exitCode=0
Jan 30 22:00:08 crc kubenswrapper[4834]: I0130 22:00:08.179460 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k78z8" event={"ID":"b78a0559-d247-4005-9c8f-914106ee39d8","Type":"ContainerDied","Data":"2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885"}
Jan 30 22:00:08 crc kubenswrapper[4834]: I0130 22:00:08.179638 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k78z8" event={"ID":"b78a0559-d247-4005-9c8f-914106ee39d8","Type":"ContainerStarted","Data":"25a01f6d8e5cab11fb7f6e80f7b7ce66660bcf88062ec830c827e2cf06745efc"}
Jan 30 22:00:09 crc kubenswrapper[4834]: I0130 22:00:09.189049 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k78z8" event={"ID":"b78a0559-d247-4005-9c8f-914106ee39d8","Type":"ContainerStarted","Data":"7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd"}
Jan 30 22:00:11 crc kubenswrapper[4834]: I0130 22:00:11.208820 4834 generic.go:334] "Generic (PLEG): container finished" podID="b78a0559-d247-4005-9c8f-914106ee39d8" containerID="7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd" exitCode=0
Jan 30 22:00:11 crc kubenswrapper[4834]: I0130 22:00:11.208915 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k78z8" event={"ID":"b78a0559-d247-4005-9c8f-914106ee39d8","Type":"ContainerDied","Data":"7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd"}
Jan 30 22:00:12 crc kubenswrapper[4834]: I0130 22:00:12.220441 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k78z8" event={"ID":"b78a0559-d247-4005-9c8f-914106ee39d8","Type":"ContainerStarted","Data":"f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4"}
Jan 30 22:00:12 crc kubenswrapper[4834]: I0130 22:00:12.238600 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k78z8" podStartSLOduration=2.764028074 podStartE2EDuration="6.238579837s" podCreationTimestamp="2026-01-30 22:00:06 +0000 UTC" firstStartedPulling="2026-01-30 22:00:08.182896049 +0000 UTC m=+2659.336042227" lastFinishedPulling="2026-01-30 22:00:11.657447812 +0000 UTC m=+2662.810593990" observedRunningTime="2026-01-30 22:00:12.235361446 +0000 UTC m=+2663.388507584" watchObservedRunningTime="2026-01-30 22:00:12.238579837 +0000 UTC m=+2663.391725975"
Jan 30 22:00:17 crc kubenswrapper[4834]: I0130 22:00:17.155281 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:17 crc kubenswrapper[4834]: I0130 22:00:17.155837 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:17 crc kubenswrapper[4834]: I0130 22:00:17.230208 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:17 crc kubenswrapper[4834]: I0130 22:00:17.323751 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:17 crc kubenswrapper[4834]: I0130 22:00:17.464204 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k78z8"]
Jan 30 22:00:19 crc kubenswrapper[4834]: I0130 22:00:19.279941 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k78z8" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" containerName="registry-server" containerID="cri-o://f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4" gracePeriod=2
Jan 30 22:00:19 crc kubenswrapper[4834]: I0130 22:00:19.835698 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k78z8"
Jan 30 22:00:19 crc kubenswrapper[4834]: I0130 22:00:19.977620 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27gwb\" (UniqueName: \"kubernetes.io/projected/b78a0559-d247-4005-9c8f-914106ee39d8-kube-api-access-27gwb\") pod \"b78a0559-d247-4005-9c8f-914106ee39d8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") "
Jan 30 22:00:19 crc kubenswrapper[4834]: I0130 22:00:19.977991 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-utilities\") pod \"b78a0559-d247-4005-9c8f-914106ee39d8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") "
Jan 30 22:00:19 crc kubenswrapper[4834]: I0130 22:00:19.978165 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-catalog-content\") pod \"b78a0559-d247-4005-9c8f-914106ee39d8\" (UID: \"b78a0559-d247-4005-9c8f-914106ee39d8\") "
Jan 30 22:00:19 crc kubenswrapper[4834]: I0130 22:00:19.978629 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-utilities" (OuterVolumeSpecName: "utilities") pod "b78a0559-d247-4005-9c8f-914106ee39d8" (UID: "b78a0559-d247-4005-9c8f-914106ee39d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:00:19 crc kubenswrapper[4834]: I0130 22:00:19.987616 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b78a0559-d247-4005-9c8f-914106ee39d8-kube-api-access-27gwb" (OuterVolumeSpecName: "kube-api-access-27gwb") pod "b78a0559-d247-4005-9c8f-914106ee39d8" (UID: "b78a0559-d247-4005-9c8f-914106ee39d8"). InnerVolumeSpecName "kube-api-access-27gwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.024163 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b78a0559-d247-4005-9c8f-914106ee39d8" (UID: "b78a0559-d247-4005-9c8f-914106ee39d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.080567 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27gwb\" (UniqueName: \"kubernetes.io/projected/b78a0559-d247-4005-9c8f-914106ee39d8-kube-api-access-27gwb\") on node \"crc\" DevicePath \"\""
Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.080595 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-utilities\") on node \"crc\" DevicePath \"\""
Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.080608 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b78a0559-d247-4005-9c8f-914106ee39d8-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.295245 4834 generic.go:334] "Generic (PLEG): container finished"
podID="b78a0559-d247-4005-9c8f-914106ee39d8" containerID="f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4" exitCode=0 Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.295294 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k78z8" event={"ID":"b78a0559-d247-4005-9c8f-914106ee39d8","Type":"ContainerDied","Data":"f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4"} Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.295323 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k78z8" event={"ID":"b78a0559-d247-4005-9c8f-914106ee39d8","Type":"ContainerDied","Data":"25a01f6d8e5cab11fb7f6e80f7b7ce66660bcf88062ec830c827e2cf06745efc"} Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.295352 4834 scope.go:117] "RemoveContainer" containerID="f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.295371 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k78z8" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.323666 4834 scope.go:117] "RemoveContainer" containerID="7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.349454 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k78z8"] Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.366739 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k78z8"] Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.376656 4834 scope.go:117] "RemoveContainer" containerID="2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.424704 4834 scope.go:117] "RemoveContainer" containerID="f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4" Jan 30 22:00:20 crc kubenswrapper[4834]: E0130 22:00:20.425176 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4\": container with ID starting with f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4 not found: ID does not exist" containerID="f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.425225 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4"} err="failed to get container status \"f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4\": rpc error: code = NotFound desc = could not find container \"f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4\": container with ID starting with f1e77f58b427e63c7ffa45a065bfbcb3b351b9fa46db13bf655bcc43a99316f4 not 
found: ID does not exist" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.425257 4834 scope.go:117] "RemoveContainer" containerID="7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd" Jan 30 22:00:20 crc kubenswrapper[4834]: E0130 22:00:20.425633 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd\": container with ID starting with 7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd not found: ID does not exist" containerID="7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.425710 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd"} err="failed to get container status \"7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd\": rpc error: code = NotFound desc = could not find container \"7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd\": container with ID starting with 7b4fc92bc35dbef62ab9d94a2338529693b43f11a2c9b2c3de5e76471be44dcd not found: ID does not exist" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.425760 4834 scope.go:117] "RemoveContainer" containerID="2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885" Jan 30 22:00:20 crc kubenswrapper[4834]: E0130 22:00:20.426220 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885\": container with ID starting with 2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885 not found: ID does not exist" containerID="2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885" Jan 30 22:00:20 crc kubenswrapper[4834]: I0130 22:00:20.426250 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885"} err="failed to get container status \"2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885\": rpc error: code = NotFound desc = could not find container \"2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885\": container with ID starting with 2290d7e236fca93ddc241a54d787822a4f4fbb520545516fbdc5a63b02d37885 not found: ID does not exist" Jan 30 22:00:21 crc kubenswrapper[4834]: I0130 22:00:21.542943 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" path="/var/lib/kubelet/pods/b78a0559-d247-4005-9c8f-914106ee39d8/volumes" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.163329 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29496841-n62kw"] Jan 30 22:01:00 crc kubenswrapper[4834]: E0130 22:01:00.165036 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" containerName="registry-server" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.165121 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" containerName="registry-server" Jan 30 22:01:00 crc kubenswrapper[4834]: E0130 22:01:00.165183 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" containerName="extract-content" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.165244 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" containerName="extract-content" Jan 30 22:01:00 crc kubenswrapper[4834]: E0130 22:01:00.165327 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" containerName="extract-utilities" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.165386 
4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" containerName="extract-utilities" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.165691 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b78a0559-d247-4005-9c8f-914106ee39d8" containerName="registry-server" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.166448 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.176014 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496841-n62kw"] Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.251235 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-config-data\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.251301 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-combined-ca-bundle\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.251341 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-fernet-keys\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.251925 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6snv\" (UniqueName: \"kubernetes.io/projected/6d5e7575-f4f8-446e-bc52-08c48fe1806c-kube-api-access-s6snv\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.353756 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-combined-ca-bundle\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.353836 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-fernet-keys\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.354007 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6snv\" (UniqueName: \"kubernetes.io/projected/6d5e7575-f4f8-446e-bc52-08c48fe1806c-kube-api-access-s6snv\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.354069 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-config-data\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.362812 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-config-data\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.362846 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-fernet-keys\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.362879 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-combined-ca-bundle\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.373074 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6snv\" (UniqueName: \"kubernetes.io/projected/6d5e7575-f4f8-446e-bc52-08c48fe1806c-kube-api-access-s6snv\") pod \"keystone-cron-29496841-n62kw\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.398452 4834 scope.go:117] "RemoveContainer" containerID="02840464958cec4c67536767b1b7a8fa47afb3116df59f556c51d1b87308f474" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.487977 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:00 crc kubenswrapper[4834]: I0130 22:01:00.987037 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29496841-n62kw"] Jan 30 22:01:01 crc kubenswrapper[4834]: I0130 22:01:01.675432 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496841-n62kw" event={"ID":"6d5e7575-f4f8-446e-bc52-08c48fe1806c","Type":"ContainerStarted","Data":"d788c8a8995efb5a6255ac356f86ad7649adf04c985051dba2983b3a5727e0e7"} Jan 30 22:01:01 crc kubenswrapper[4834]: I0130 22:01:01.675752 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496841-n62kw" event={"ID":"6d5e7575-f4f8-446e-bc52-08c48fe1806c","Type":"ContainerStarted","Data":"dcd9e8b22abf7e33d850212086f010fb44f0ebe230b0ac1b224338c9ba7f2fde"} Jan 30 22:01:01 crc kubenswrapper[4834]: I0130 22:01:01.695835 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29496841-n62kw" podStartSLOduration=1.69581656 podStartE2EDuration="1.69581656s" podCreationTimestamp="2026-01-30 22:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 22:01:01.690824439 +0000 UTC m=+2712.843970577" watchObservedRunningTime="2026-01-30 22:01:01.69581656 +0000 UTC m=+2712.848962708" Jan 30 22:01:03 crc kubenswrapper[4834]: I0130 22:01:03.700110 4834 generic.go:334] "Generic (PLEG): container finished" podID="6d5e7575-f4f8-446e-bc52-08c48fe1806c" containerID="d788c8a8995efb5a6255ac356f86ad7649adf04c985051dba2983b3a5727e0e7" exitCode=0 Jan 30 22:01:03 crc kubenswrapper[4834]: I0130 22:01:03.700172 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496841-n62kw" event={"ID":"6d5e7575-f4f8-446e-bc52-08c48fe1806c","Type":"ContainerDied","Data":"d788c8a8995efb5a6255ac356f86ad7649adf04c985051dba2983b3a5727e0e7"} 
Jan 30 22:01:04 crc kubenswrapper[4834]: I0130 22:01:04.160789 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:01:04 crc kubenswrapper[4834]: I0130 22:01:04.160907 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.093004 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.159578 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6snv\" (UniqueName: \"kubernetes.io/projected/6d5e7575-f4f8-446e-bc52-08c48fe1806c-kube-api-access-s6snv\") pod \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.159741 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-combined-ca-bundle\") pod \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.159974 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-config-data\") pod \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\" (UID: 
\"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.160006 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-fernet-keys\") pod \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\" (UID: \"6d5e7575-f4f8-446e-bc52-08c48fe1806c\") " Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.165563 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6d5e7575-f4f8-446e-bc52-08c48fe1806c" (UID: "6d5e7575-f4f8-446e-bc52-08c48fe1806c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.165572 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5e7575-f4f8-446e-bc52-08c48fe1806c-kube-api-access-s6snv" (OuterVolumeSpecName: "kube-api-access-s6snv") pod "6d5e7575-f4f8-446e-bc52-08c48fe1806c" (UID: "6d5e7575-f4f8-446e-bc52-08c48fe1806c"). InnerVolumeSpecName "kube-api-access-s6snv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.191831 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d5e7575-f4f8-446e-bc52-08c48fe1806c" (UID: "6d5e7575-f4f8-446e-bc52-08c48fe1806c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.221697 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-config-data" (OuterVolumeSpecName: "config-data") pod "6d5e7575-f4f8-446e-bc52-08c48fe1806c" (UID: "6d5e7575-f4f8-446e-bc52-08c48fe1806c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.262915 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.262942 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.262951 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d5e7575-f4f8-446e-bc52-08c48fe1806c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.262959 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6snv\" (UniqueName: \"kubernetes.io/projected/6d5e7575-f4f8-446e-bc52-08c48fe1806c-kube-api-access-s6snv\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.723707 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29496841-n62kw" event={"ID":"6d5e7575-f4f8-446e-bc52-08c48fe1806c","Type":"ContainerDied","Data":"dcd9e8b22abf7e33d850212086f010fb44f0ebe230b0ac1b224338c9ba7f2fde"} Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.723756 4834 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="dcd9e8b22abf7e33d850212086f010fb44f0ebe230b0ac1b224338c9ba7f2fde" Jan 30 22:01:05 crc kubenswrapper[4834]: I0130 22:01:05.723771 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29496841-n62kw" Jan 30 22:01:28 crc kubenswrapper[4834]: I0130 22:01:28.934408 4834 generic.go:334] "Generic (PLEG): container finished" podID="5f06d94f-e9f1-42b2-9978-b7a128728a60" containerID="30ef7d8d400b58294b73602b449c10e76d133e8bcf23e19f059b6565560d3cd1" exitCode=0 Jan 30 22:01:28 crc kubenswrapper[4834]: I0130 22:01:28.934508 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc" event={"ID":"5f06d94f-e9f1-42b2-9978-b7a128728a60","Type":"ContainerDied","Data":"30ef7d8d400b58294b73602b449c10e76d133e8bcf23e19f059b6565560d3cd1"} Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.493958 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591350 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-extra-config-0\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591462 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-1\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591591 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-inventory\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591626 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-0\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591679 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-1\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591812 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-combined-ca-bundle\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591854 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-0\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591905 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-ssh-key-openstack-edpm-ipam\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: 
\"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.591935 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7dlz\" (UniqueName: \"kubernetes.io/projected/5f06d94f-e9f1-42b2-9978-b7a128728a60-kube-api-access-j7dlz\") pod \"5f06d94f-e9f1-42b2-9978-b7a128728a60\" (UID: \"5f06d94f-e9f1-42b2-9978-b7a128728a60\") " Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.596900 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.600306 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f06d94f-e9f1-42b2-9978-b7a128728a60-kube-api-access-j7dlz" (OuterVolumeSpecName: "kube-api-access-j7dlz") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "kube-api-access-j7dlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.627532 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.628233 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.629722 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.631353 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-inventory" (OuterVolumeSpecName: "inventory") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.641552 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.641925 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.653669 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "5f06d94f-e9f1-42b2-9978-b7a128728a60" (UID: "5f06d94f-e9f1-42b2-9978-b7a128728a60"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695052 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695430 4834 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695453 4834 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695466 4834 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695478 4834 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695490 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695503 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7dlz\" (UniqueName: \"kubernetes.io/projected/5f06d94f-e9f1-42b2-9978-b7a128728a60-kube-api-access-j7dlz\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695518 4834 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.695530 4834 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5f06d94f-e9f1-42b2-9978-b7a128728a60-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.957521 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc" event={"ID":"5f06d94f-e9f1-42b2-9978-b7a128728a60","Type":"ContainerDied","Data":"8a178ca2da1beaa831893e5c55174d0633c595c6dd01311aaf3ebb0eaca543cc"} Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.957563 4834 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="8a178ca2da1beaa831893e5c55174d0633c595c6dd01311aaf3ebb0eaca543cc" Jan 30 22:01:30 crc kubenswrapper[4834]: I0130 22:01:30.957645 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-2xqvc" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.078066 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj"] Jan 30 22:01:31 crc kubenswrapper[4834]: E0130 22:01:31.078420 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f06d94f-e9f1-42b2-9978-b7a128728a60" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.078451 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f06d94f-e9f1-42b2-9978-b7a128728a60" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 22:01:31 crc kubenswrapper[4834]: E0130 22:01:31.078463 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5e7575-f4f8-446e-bc52-08c48fe1806c" containerName="keystone-cron" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.078470 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5e7575-f4f8-446e-bc52-08c48fe1806c" containerName="keystone-cron" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.078689 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f06d94f-e9f1-42b2-9978-b7a128728a60" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.078713 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5e7575-f4f8-446e-bc52-08c48fe1806c" containerName="keystone-cron" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.079368 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.081331 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.081584 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.082117 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.082358 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.084298 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.090460 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj"] Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.203420 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.203523 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.203571 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.203654 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.203778 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.203830 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gtnh\" (UniqueName: \"kubernetes.io/projected/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-kube-api-access-9gtnh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.203876 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.305893 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.305953 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gtnh\" (UniqueName: \"kubernetes.io/projected/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-kube-api-access-9gtnh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.305991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.306071 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.306114 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.306147 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.306229 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.311037 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.311764 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.311961 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.313518 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.313586 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.314481 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.333391 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gtnh\" (UniqueName: \"kubernetes.io/projected/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-kube-api-access-9gtnh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.398996 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.974547 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj"] Jan 30 22:01:31 crc kubenswrapper[4834]: I0130 22:01:31.977066 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 22:01:32 crc kubenswrapper[4834]: I0130 22:01:32.975812 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" event={"ID":"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7","Type":"ContainerStarted","Data":"75228a7008cedac789e91a0010e2a313cd09adcf9decd55ce39d9f15989e5765"} Jan 30 22:01:32 crc kubenswrapper[4834]: I0130 22:01:32.976457 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" event={"ID":"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7","Type":"ContainerStarted","Data":"38d0f0727cee05fc20a73fc6a7afcc66b505201e6adbc9fae584fc5348e70465"} Jan 30 22:01:32 crc kubenswrapper[4834]: I0130 22:01:32.999689 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" podStartSLOduration=1.575401499 podStartE2EDuration="1.999669985s" podCreationTimestamp="2026-01-30 22:01:31 +0000 UTC" firstStartedPulling="2026-01-30 22:01:31.97682642 +0000 UTC m=+2743.129972558" lastFinishedPulling="2026-01-30 22:01:32.401094896 +0000 UTC m=+2743.554241044" observedRunningTime="2026-01-30 22:01:32.992305307 +0000 UTC m=+2744.145451455" watchObservedRunningTime="2026-01-30 22:01:32.999669985 +0000 UTC m=+2744.152816123" Jan 30 22:01:34 crc kubenswrapper[4834]: I0130 22:01:34.161752 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:01:34 crc kubenswrapper[4834]: I0130 22:01:34.162001 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:02:04 crc kubenswrapper[4834]: I0130 22:02:04.161297 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:02:04 crc kubenswrapper[4834]: I0130 22:02:04.161986 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:02:04 crc kubenswrapper[4834]: I0130 22:02:04.162042 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 22:02:04 crc kubenswrapper[4834]: I0130 22:02:04.163029 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1bf310a9d0e5abd60888ccf207fce82c1cbc19301cf50c7335e1336ee9e1f0bd"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 22:02:04 crc kubenswrapper[4834]: I0130 22:02:04.163227 4834 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://1bf310a9d0e5abd60888ccf207fce82c1cbc19301cf50c7335e1336ee9e1f0bd" gracePeriod=600 Jan 30 22:02:05 crc kubenswrapper[4834]: I0130 22:02:05.305280 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="1bf310a9d0e5abd60888ccf207fce82c1cbc19301cf50c7335e1336ee9e1f0bd" exitCode=0 Jan 30 22:02:05 crc kubenswrapper[4834]: I0130 22:02:05.305370 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"1bf310a9d0e5abd60888ccf207fce82c1cbc19301cf50c7335e1336ee9e1f0bd"} Jan 30 22:02:05 crc kubenswrapper[4834]: I0130 22:02:05.305652 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682"} Jan 30 22:02:05 crc kubenswrapper[4834]: I0130 22:02:05.305676 4834 scope.go:117] "RemoveContainer" containerID="c83a68d65516541aa876e90f0eea68cc47e9eff81c2a85534a4ba1a58804fe87" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.630498 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-92h2h"] Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.632780 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.642757 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92h2h"] Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.673994 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mhfb\" (UniqueName: \"kubernetes.io/projected/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-kube-api-access-6mhfb\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.674291 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-catalog-content\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.674717 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-utilities\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.776958 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-utilities\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.777458 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6mhfb\" (UniqueName: \"kubernetes.io/projected/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-kube-api-access-6mhfb\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.777588 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-catalog-content\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.777478 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-utilities\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.778046 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-catalog-content\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.800506 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mhfb\" (UniqueName: \"kubernetes.io/projected/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-kube-api-access-6mhfb\") pod \"redhat-operators-92h2h\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:56 crc kubenswrapper[4834]: I0130 22:02:56.958538 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:02:57 crc kubenswrapper[4834]: I0130 22:02:57.458211 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-92h2h"] Jan 30 22:02:57 crc kubenswrapper[4834]: I0130 22:02:57.815876 4834 generic.go:334] "Generic (PLEG): container finished" podID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerID="e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930" exitCode=0 Jan 30 22:02:57 crc kubenswrapper[4834]: I0130 22:02:57.815924 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92h2h" event={"ID":"c9e1c04b-0a98-46e0-9343-6b69be85c5ce","Type":"ContainerDied","Data":"e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930"} Jan 30 22:02:57 crc kubenswrapper[4834]: I0130 22:02:57.815954 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92h2h" event={"ID":"c9e1c04b-0a98-46e0-9343-6b69be85c5ce","Type":"ContainerStarted","Data":"f5b2841ac19a818af04491c3ab1c1b8e586fb2ea6afd71b99db1fc7ea076ab6b"} Jan 30 22:02:59 crc kubenswrapper[4834]: I0130 22:02:59.832981 4834 generic.go:334] "Generic (PLEG): container finished" podID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerID="0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba" exitCode=0 Jan 30 22:02:59 crc kubenswrapper[4834]: I0130 22:02:59.833206 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92h2h" event={"ID":"c9e1c04b-0a98-46e0-9343-6b69be85c5ce","Type":"ContainerDied","Data":"0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba"} Jan 30 22:03:00 crc kubenswrapper[4834]: I0130 22:03:00.565303 4834 scope.go:117] "RemoveContainer" containerID="b1babb52543fe618601978565f277ae9bdc7315ae4f439bd27ab08932f36286b" Jan 30 22:03:00 crc kubenswrapper[4834]: I0130 22:03:00.589947 4834 scope.go:117] "RemoveContainer" 
containerID="814e4641c7e44cdec29bef1a1e0f59d9c0c912f99715b04aa6b1d4e2dbd674b5" Jan 30 22:03:00 crc kubenswrapper[4834]: I0130 22:03:00.618726 4834 scope.go:117] "RemoveContainer" containerID="a2cf3f193b2af2cb1d3ccf1d5b6225ff9cc3cbca2f7dc8d143010e6815d4579e" Jan 30 22:03:05 crc kubenswrapper[4834]: I0130 22:03:05.905919 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92h2h" event={"ID":"c9e1c04b-0a98-46e0-9343-6b69be85c5ce","Type":"ContainerStarted","Data":"d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125"} Jan 30 22:03:05 crc kubenswrapper[4834]: I0130 22:03:05.930336 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-92h2h" podStartSLOduration=2.781895805 podStartE2EDuration="9.930313921s" podCreationTimestamp="2026-01-30 22:02:56 +0000 UTC" firstStartedPulling="2026-01-30 22:02:57.818660644 +0000 UTC m=+2828.971806782" lastFinishedPulling="2026-01-30 22:03:04.96707876 +0000 UTC m=+2836.120224898" observedRunningTime="2026-01-30 22:03:05.923585661 +0000 UTC m=+2837.076731809" watchObservedRunningTime="2026-01-30 22:03:05.930313921 +0000 UTC m=+2837.083460059" Jan 30 22:03:06 crc kubenswrapper[4834]: I0130 22:03:06.959337 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:03:06 crc kubenswrapper[4834]: I0130 22:03:06.960253 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:03:08 crc kubenswrapper[4834]: I0130 22:03:08.003436 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-92h2h" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="registry-server" probeResult="failure" output=< Jan 30 22:03:08 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 22:03:08 crc kubenswrapper[4834]: > 
Jan 30 22:03:17 crc kubenswrapper[4834]: I0130 22:03:17.028854 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:03:17 crc kubenswrapper[4834]: I0130 22:03:17.082027 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:03:17 crc kubenswrapper[4834]: I0130 22:03:17.275202 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92h2h"] Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.024006 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-92h2h" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="registry-server" containerID="cri-o://d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125" gracePeriod=2 Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.622929 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.698810 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-utilities\") pod \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.698860 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-catalog-content\") pod \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.699978 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-utilities" (OuterVolumeSpecName: "utilities") pod "c9e1c04b-0a98-46e0-9343-6b69be85c5ce" (UID: "c9e1c04b-0a98-46e0-9343-6b69be85c5ce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.700577 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.801701 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mhfb\" (UniqueName: \"kubernetes.io/projected/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-kube-api-access-6mhfb\") pod \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\" (UID: \"c9e1c04b-0a98-46e0-9343-6b69be85c5ce\") " Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.808294 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-kube-api-access-6mhfb" (OuterVolumeSpecName: "kube-api-access-6mhfb") pod "c9e1c04b-0a98-46e0-9343-6b69be85c5ce" (UID: "c9e1c04b-0a98-46e0-9343-6b69be85c5ce"). InnerVolumeSpecName "kube-api-access-6mhfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.835265 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9e1c04b-0a98-46e0-9343-6b69be85c5ce" (UID: "c9e1c04b-0a98-46e0-9343-6b69be85c5ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.903833 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mhfb\" (UniqueName: \"kubernetes.io/projected/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-kube-api-access-6mhfb\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:19 crc kubenswrapper[4834]: I0130 22:03:19.903894 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e1c04b-0a98-46e0-9343-6b69be85c5ce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.035892 4834 generic.go:334] "Generic (PLEG): container finished" podID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerID="d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125" exitCode=0 Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.035947 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92h2h" event={"ID":"c9e1c04b-0a98-46e0-9343-6b69be85c5ce","Type":"ContainerDied","Data":"d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125"} Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.035987 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-92h2h" event={"ID":"c9e1c04b-0a98-46e0-9343-6b69be85c5ce","Type":"ContainerDied","Data":"f5b2841ac19a818af04491c3ab1c1b8e586fb2ea6afd71b99db1fc7ea076ab6b"} Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.036013 4834 scope.go:117] "RemoveContainer" containerID="d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.036147 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-92h2h" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.059957 4834 scope.go:117] "RemoveContainer" containerID="0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.093453 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-92h2h"] Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.097556 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-92h2h"] Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.106825 4834 scope.go:117] "RemoveContainer" containerID="e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.147724 4834 scope.go:117] "RemoveContainer" containerID="d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125" Jan 30 22:03:20 crc kubenswrapper[4834]: E0130 22:03:20.148221 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125\": container with ID starting with d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125 not found: ID does not exist" containerID="d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.148272 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125"} err="failed to get container status \"d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125\": rpc error: code = NotFound desc = could not find container \"d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125\": container with ID starting with d8ed82352316940961294fa221e56b0729604af72d1ea860ddf4a08f5bed9125 not found: ID does 
not exist" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.148299 4834 scope.go:117] "RemoveContainer" containerID="0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba" Jan 30 22:03:20 crc kubenswrapper[4834]: E0130 22:03:20.148909 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba\": container with ID starting with 0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba not found: ID does not exist" containerID="0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.148939 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba"} err="failed to get container status \"0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba\": rpc error: code = NotFound desc = could not find container \"0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba\": container with ID starting with 0a243a88d1f2b1b0de2acf6330ffc1ebcbf1ba13dc64e79d7c105401e1a06bba not found: ID does not exist" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.148957 4834 scope.go:117] "RemoveContainer" containerID="e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930" Jan 30 22:03:20 crc kubenswrapper[4834]: E0130 22:03:20.149408 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930\": container with ID starting with e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930 not found: ID does not exist" containerID="e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930" Jan 30 22:03:20 crc kubenswrapper[4834]: I0130 22:03:20.149433 4834 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930"} err="failed to get container status \"e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930\": rpc error: code = NotFound desc = could not find container \"e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930\": container with ID starting with e8419f0546d50c4510615624d636829c9976b4295c28fcea6c060a5a1809e930 not found: ID does not exist" Jan 30 22:03:21 crc kubenswrapper[4834]: I0130 22:03:21.542196 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" path="/var/lib/kubelet/pods/c9e1c04b-0a98-46e0-9343-6b69be85c5ce/volumes" Jan 30 22:03:45 crc kubenswrapper[4834]: I0130 22:03:45.302633 4834 generic.go:334] "Generic (PLEG): container finished" podID="dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" containerID="75228a7008cedac789e91a0010e2a313cd09adcf9decd55ce39d9f15989e5765" exitCode=0 Jan 30 22:03:45 crc kubenswrapper[4834]: I0130 22:03:45.302735 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" event={"ID":"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7","Type":"ContainerDied","Data":"75228a7008cedac789e91a0010e2a313cd09adcf9decd55ce39d9f15989e5765"} Jan 30 22:03:46 crc kubenswrapper[4834]: I0130 22:03:46.955883 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.010928 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-2\") pod \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.011027 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-1\") pod \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.011265 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-0\") pod \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.011315 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-telemetry-combined-ca-bundle\") pod \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.011355 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ssh-key-openstack-edpm-ipam\") pod \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\" (UID: 
\"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.011414 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gtnh\" (UniqueName: \"kubernetes.io/projected/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-kube-api-access-9gtnh\") pod \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.011476 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-inventory\") pod \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\" (UID: \"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7\") " Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.018619 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" (UID: "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.020291 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-kube-api-access-9gtnh" (OuterVolumeSpecName: "kube-api-access-9gtnh") pod "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" (UID: "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7"). InnerVolumeSpecName "kube-api-access-9gtnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.050758 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" (UID: "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.060233 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-inventory" (OuterVolumeSpecName: "inventory") pod "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" (UID: "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.062821 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" (UID: "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.063679 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" (UID: "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.074223 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" (UID: "dda5e26c-7c67-4aa6-9dea-dde313cbb7d7"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.114156 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.114204 4834 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.114220 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.114234 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gtnh\" (UniqueName: \"kubernetes.io/projected/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-kube-api-access-9gtnh\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.114248 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:47 crc 
kubenswrapper[4834]: I0130 22:03:47.114260 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.114272 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/dda5e26c-7c67-4aa6-9dea-dde313cbb7d7-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.329329 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" event={"ID":"dda5e26c-7c67-4aa6-9dea-dde313cbb7d7","Type":"ContainerDied","Data":"38d0f0727cee05fc20a73fc6a7afcc66b505201e6adbc9fae584fc5348e70465"} Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.329370 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38d0f0727cee05fc20a73fc6a7afcc66b505201e6adbc9fae584fc5348e70465" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.329375 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.457363 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd"] Jan 30 22:03:47 crc kubenswrapper[4834]: E0130 22:03:47.458654 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="extract-utilities" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.458687 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="extract-utilities" Jan 30 22:03:47 crc kubenswrapper[4834]: E0130 22:03:47.458728 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.458744 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 22:03:47 crc kubenswrapper[4834]: E0130 22:03:47.458774 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="registry-server" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.458783 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="registry-server" Jan 30 22:03:47 crc kubenswrapper[4834]: E0130 22:03:47.458811 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="extract-content" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.458819 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="extract-content" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.459106 4834 
memory_manager.go:354] "RemoveStaleState removing state" podUID="dda5e26c-7c67-4aa6-9dea-dde313cbb7d7" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.459136 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e1c04b-0a98-46e0-9343-6b69be85c5ce" containerName="registry-server" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.460327 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.463366 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.463449 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.463464 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.464057 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.481919 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd"] Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.526092 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-tqhxp" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.628708 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-1\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.628791 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.628815 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.628905 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7d9\" (UniqueName: \"kubernetes.io/projected/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-kube-api-access-zm7d9\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.629164 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.731205 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7d9\" (UniqueName: \"kubernetes.io/projected/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-kube-api-access-zm7d9\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.731334 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.732066 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.732267 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.732313 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.736028 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.736080 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.736444 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.745521 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.752676 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7d9\" (UniqueName: \"kubernetes.io/projected/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-kube-api-access-zm7d9\") pod \"logging-edpm-deployment-openstack-edpm-ipam-mxjcd\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:47 crc kubenswrapper[4834]: I0130 22:03:47.841605 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:03:48 crc kubenswrapper[4834]: I0130 22:03:48.360270 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd"] Jan 30 22:03:49 crc kubenswrapper[4834]: I0130 22:03:49.354685 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" event={"ID":"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5","Type":"ContainerStarted","Data":"171ea685a27c2771e839a4457ad5fb7adeb9cf22baabfb433cd62d65c93323e2"} Jan 30 22:03:49 crc kubenswrapper[4834]: I0130 22:03:49.355300 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" event={"ID":"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5","Type":"ContainerStarted","Data":"089be12642b5de859a0dd8839097bb21e2933a5407818c9ae56568c988258103"} Jan 30 22:03:49 crc kubenswrapper[4834]: I0130 22:03:49.379869 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" podStartSLOduration=1.9454757200000001 podStartE2EDuration="2.379851861s" podCreationTimestamp="2026-01-30 22:03:47 +0000 UTC" firstStartedPulling="2026-01-30 22:03:48.368385317 +0000 UTC m=+2879.521531455" 
lastFinishedPulling="2026-01-30 22:03:48.802761458 +0000 UTC m=+2879.955907596" observedRunningTime="2026-01-30 22:03:49.373970684 +0000 UTC m=+2880.527116822" watchObservedRunningTime="2026-01-30 22:03:49.379851861 +0000 UTC m=+2880.532997989" Jan 30 22:04:02 crc kubenswrapper[4834]: I0130 22:04:02.471947 4834 generic.go:334] "Generic (PLEG): container finished" podID="b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" containerID="171ea685a27c2771e839a4457ad5fb7adeb9cf22baabfb433cd62d65c93323e2" exitCode=0 Jan 30 22:04:02 crc kubenswrapper[4834]: I0130 22:04:02.472056 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" event={"ID":"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5","Type":"ContainerDied","Data":"171ea685a27c2771e839a4457ad5fb7adeb9cf22baabfb433cd62d65c93323e2"} Jan 30 22:04:03 crc kubenswrapper[4834]: I0130 22:04:03.985427 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:04:03 crc kubenswrapper[4834]: I0130 22:04:03.998326 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-1\") pod \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " Jan 30 22:04:03 crc kubenswrapper[4834]: I0130 22:04:03.998484 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-inventory\") pod \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " Jan 30 22:04:03 crc kubenswrapper[4834]: I0130 22:04:03.998516 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-0\") pod \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " Jan 30 22:04:03 crc kubenswrapper[4834]: I0130 22:04:03.998582 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-ssh-key-openstack-edpm-ipam\") pod \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " Jan 30 22:04:03 crc kubenswrapper[4834]: I0130 22:04:03.998724 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm7d9\" (UniqueName: \"kubernetes.io/projected/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-kube-api-access-zm7d9\") pod \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\" (UID: \"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5\") " Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.013772 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-kube-api-access-zm7d9" (OuterVolumeSpecName: "kube-api-access-zm7d9") pod "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" (UID: "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5"). InnerVolumeSpecName "kube-api-access-zm7d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.049709 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" (UID: "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.052575 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" (UID: "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.060654 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-inventory" (OuterVolumeSpecName: "inventory") pod "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" (UID: "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.079366 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" (UID: "b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.101326 4834 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-inventory\") on node \"crc\" DevicePath \"\"" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.101361 4834 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.101375 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.101388 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm7d9\" (UniqueName: \"kubernetes.io/projected/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-kube-api-access-zm7d9\") on node \"crc\" DevicePath \"\"" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.101419 4834 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.160683 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.160730 4834 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.491873 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" event={"ID":"b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5","Type":"ContainerDied","Data":"089be12642b5de859a0dd8839097bb21e2933a5407818c9ae56568c988258103"} Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.492178 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="089be12642b5de859a0dd8839097bb21e2933a5407818c9ae56568c988258103" Jan 30 22:04:04 crc kubenswrapper[4834]: I0130 22:04:04.491944 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-mxjcd" Jan 30 22:04:34 crc kubenswrapper[4834]: I0130 22:04:34.160851 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:04:34 crc kubenswrapper[4834]: I0130 22:04:34.161819 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:05:04 crc kubenswrapper[4834]: I0130 22:05:04.160721 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:05:04 crc kubenswrapper[4834]: I0130 22:05:04.161404 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:05:04 crc kubenswrapper[4834]: I0130 22:05:04.161449 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 22:05:04 crc kubenswrapper[4834]: I0130 22:05:04.162004 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 22:05:04 crc kubenswrapper[4834]: I0130 22:05:04.162058 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" gracePeriod=600 Jan 30 22:05:04 crc kubenswrapper[4834]: E0130 22:05:04.295669 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:05:05 crc kubenswrapper[4834]: I0130 22:05:05.073207 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" exitCode=0 Jan 30 22:05:05 crc kubenswrapper[4834]: I0130 22:05:05.073272 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682"} Jan 30 22:05:05 crc kubenswrapper[4834]: I0130 22:05:05.073341 4834 scope.go:117] "RemoveContainer" containerID="1bf310a9d0e5abd60888ccf207fce82c1cbc19301cf50c7335e1336ee9e1f0bd" Jan 30 22:05:05 crc kubenswrapper[4834]: I0130 22:05:05.074077 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:05:05 crc kubenswrapper[4834]: E0130 22:05:05.074563 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:05:20 crc kubenswrapper[4834]: I0130 22:05:20.532580 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:05:20 crc kubenswrapper[4834]: E0130 22:05:20.534558 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:05:35 crc kubenswrapper[4834]: I0130 22:05:35.532269 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:05:35 crc kubenswrapper[4834]: E0130 22:05:35.533216 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:05:49 crc kubenswrapper[4834]: I0130 22:05:49.545004 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:05:49 crc kubenswrapper[4834]: E0130 22:05:49.546033 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:06:03 crc kubenswrapper[4834]: I0130 22:06:03.532270 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:06:03 crc kubenswrapper[4834]: E0130 22:06:03.533294 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:06:14 crc kubenswrapper[4834]: I0130 22:06:14.531374 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:06:14 crc kubenswrapper[4834]: E0130 22:06:14.532323 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:23.999449 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 22:06:24 crc kubenswrapper[4834]: E0130 22:06:24.000630 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" containerName="logging-edpm-deployment-openstack-edpm-ipam" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.000655 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" containerName="logging-edpm-deployment-openstack-edpm-ipam" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.000963 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5" containerName="logging-edpm-deployment-openstack-edpm-ipam" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.002797 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.007086 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.007354 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-q6nhs" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.007557 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.007699 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.015612 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.187742 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.187796 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.187851 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdbxw\" (UniqueName: 
\"kubernetes.io/projected/acbc4be9-32f5-471a-b881-578a0b7b715f-kube-api-access-tdbxw\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.187970 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.188035 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.188056 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.188102 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.188301 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.188463 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-config-data\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.289954 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.290044 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.290073 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.290126 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.290562 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.290622 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-config-data\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.290708 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.291023 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.291185 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.291484 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.291914 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-config-data\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.291966 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.292003 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.292040 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdbxw\" (UniqueName: \"kubernetes.io/projected/acbc4be9-32f5-471a-b881-578a0b7b715f-kube-api-access-tdbxw\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " 
pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.305329 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.305334 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.305470 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.309175 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdbxw\" (UniqueName: \"kubernetes.io/projected/acbc4be9-32f5-471a-b881-578a0b7b715f-kube-api-access-tdbxw\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.335091 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"tempest-tests-tempest\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.344916 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.822696 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 30 22:06:24 crc kubenswrapper[4834]: I0130 22:06:24.915326 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"acbc4be9-32f5-471a-b881-578a0b7b715f","Type":"ContainerStarted","Data":"6fb9dd629921ab700fc4a75ebb9ebd38bfa3f99af5735bc077c85e3ddfe4dbff"} Jan 30 22:06:27 crc kubenswrapper[4834]: I0130 22:06:27.533146 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:06:27 crc kubenswrapper[4834]: E0130 22:06:27.534038 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:06:41 crc kubenswrapper[4834]: I0130 22:06:41.531039 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:06:41 crc kubenswrapper[4834]: E0130 22:06:41.531797 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:06:41 crc kubenswrapper[4834]: I0130 22:06:41.980094 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-z6tph"] Jan 30 22:06:41 crc kubenswrapper[4834]: I0130 22:06:41.992977 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6tph"] Jan 30 22:06:41 crc kubenswrapper[4834]: I0130 22:06:41.993119 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.150686 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-utilities\") pod \"redhat-marketplace-z6tph\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.150759 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c476z\" (UniqueName: \"kubernetes.io/projected/fc9056bc-6e8d-410f-a2d1-cf69ec103561-kube-api-access-c476z\") pod \"redhat-marketplace-z6tph\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.151065 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-catalog-content\") pod \"redhat-marketplace-z6tph\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.252700 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-catalog-content\") pod \"redhat-marketplace-z6tph\" (UID: 
\"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.252857 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-utilities\") pod \"redhat-marketplace-z6tph\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.252888 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c476z\" (UniqueName: \"kubernetes.io/projected/fc9056bc-6e8d-410f-a2d1-cf69ec103561-kube-api-access-c476z\") pod \"redhat-marketplace-z6tph\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.253256 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-catalog-content\") pod \"redhat-marketplace-z6tph\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.253503 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-utilities\") pod \"redhat-marketplace-z6tph\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.292459 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c476z\" (UniqueName: \"kubernetes.io/projected/fc9056bc-6e8d-410f-a2d1-cf69ec103561-kube-api-access-c476z\") pod \"redhat-marketplace-z6tph\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " 
pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:42 crc kubenswrapper[4834]: I0130 22:06:42.325091 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:06:53 crc kubenswrapper[4834]: I0130 22:06:53.530775 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:06:53 crc kubenswrapper[4834]: E0130 22:06:53.531506 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:06:58 crc kubenswrapper[4834]: E0130 22:06:58.926812 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 30 22:06:58 crc kubenswrapper[4834]: E0130 22:06:58.927488 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdbxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(acbc4be9-32f5-471a-b881-578a0b7b715f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 30 22:06:58 crc kubenswrapper[4834]: E0130 22:06:58.928855 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="acbc4be9-32f5-471a-b881-578a0b7b715f" Jan 30 22:06:59 crc kubenswrapper[4834]: I0130 22:06:59.325346 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6tph"] Jan 30 22:06:59 crc kubenswrapper[4834]: I0130 22:06:59.719320 4834 generic.go:334] "Generic (PLEG): container finished" podID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerID="3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee" exitCode=0 Jan 30 22:06:59 
crc kubenswrapper[4834]: I0130 22:06:59.720009 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6tph" event={"ID":"fc9056bc-6e8d-410f-a2d1-cf69ec103561","Type":"ContainerDied","Data":"3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee"} Jan 30 22:06:59 crc kubenswrapper[4834]: I0130 22:06:59.720109 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6tph" event={"ID":"fc9056bc-6e8d-410f-a2d1-cf69ec103561","Type":"ContainerStarted","Data":"18f61f99fe52b7e2887d21b52ad3b46cedb89a3ead88b727c336ce8c98b8879f"} Jan 30 22:06:59 crc kubenswrapper[4834]: I0130 22:06:59.720985 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 22:06:59 crc kubenswrapper[4834]: E0130 22:06:59.721209 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="acbc4be9-32f5-471a-b881-578a0b7b715f" Jan 30 22:07:00 crc kubenswrapper[4834]: I0130 22:07:00.735489 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6tph" event={"ID":"fc9056bc-6e8d-410f-a2d1-cf69ec103561","Type":"ContainerStarted","Data":"146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f"} Jan 30 22:07:01 crc kubenswrapper[4834]: I0130 22:07:01.749255 4834 generic.go:334] "Generic (PLEG): container finished" podID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerID="146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f" exitCode=0 Jan 30 22:07:01 crc kubenswrapper[4834]: I0130 22:07:01.749354 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6tph" 
event={"ID":"fc9056bc-6e8d-410f-a2d1-cf69ec103561","Type":"ContainerDied","Data":"146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f"} Jan 30 22:07:02 crc kubenswrapper[4834]: I0130 22:07:02.760964 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6tph" event={"ID":"fc9056bc-6e8d-410f-a2d1-cf69ec103561","Type":"ContainerStarted","Data":"073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66"} Jan 30 22:07:02 crc kubenswrapper[4834]: I0130 22:07:02.786809 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z6tph" podStartSLOduration=19.35051205 podStartE2EDuration="21.786786123s" podCreationTimestamp="2026-01-30 22:06:41 +0000 UTC" firstStartedPulling="2026-01-30 22:06:59.720763038 +0000 UTC m=+3070.873909176" lastFinishedPulling="2026-01-30 22:07:02.157037101 +0000 UTC m=+3073.310183249" observedRunningTime="2026-01-30 22:07:02.784373205 +0000 UTC m=+3073.937519343" watchObservedRunningTime="2026-01-30 22:07:02.786786123 +0000 UTC m=+3073.939932261" Jan 30 22:07:07 crc kubenswrapper[4834]: I0130 22:07:07.531350 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:07:07 crc kubenswrapper[4834]: E0130 22:07:07.532713 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:07:12 crc kubenswrapper[4834]: I0130 22:07:12.325665 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:07:12 crc 
kubenswrapper[4834]: I0130 22:07:12.326248 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:07:12 crc kubenswrapper[4834]: I0130 22:07:12.370025 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:07:12 crc kubenswrapper[4834]: I0130 22:07:12.933650 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:07:13 crc kubenswrapper[4834]: I0130 22:07:13.168688 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6tph"] Jan 30 22:07:14 crc kubenswrapper[4834]: I0130 22:07:14.900182 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z6tph" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerName="registry-server" containerID="cri-o://073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66" gracePeriod=2 Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.417671 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.864062 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.918599 4834 generic.go:334] "Generic (PLEG): container finished" podID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerID="073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66" exitCode=0 Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.918642 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6tph" event={"ID":"fc9056bc-6e8d-410f-a2d1-cf69ec103561","Type":"ContainerDied","Data":"073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66"} Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.918674 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6tph" event={"ID":"fc9056bc-6e8d-410f-a2d1-cf69ec103561","Type":"ContainerDied","Data":"18f61f99fe52b7e2887d21b52ad3b46cedb89a3ead88b727c336ce8c98b8879f"} Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.918694 4834 scope.go:117] "RemoveContainer" containerID="073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66" Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.919377 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6tph" Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.927946 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c476z\" (UniqueName: \"kubernetes.io/projected/fc9056bc-6e8d-410f-a2d1-cf69ec103561-kube-api-access-c476z\") pod \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.928084 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-catalog-content\") pod \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.928206 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-utilities\") pod \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\" (UID: \"fc9056bc-6e8d-410f-a2d1-cf69ec103561\") " Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.930013 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-utilities" (OuterVolumeSpecName: "utilities") pod "fc9056bc-6e8d-410f-a2d1-cf69ec103561" (UID: "fc9056bc-6e8d-410f-a2d1-cf69ec103561"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.933704 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9056bc-6e8d-410f-a2d1-cf69ec103561-kube-api-access-c476z" (OuterVolumeSpecName: "kube-api-access-c476z") pod "fc9056bc-6e8d-410f-a2d1-cf69ec103561" (UID: "fc9056bc-6e8d-410f-a2d1-cf69ec103561"). InnerVolumeSpecName "kube-api-access-c476z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.945720 4834 scope.go:117] "RemoveContainer" containerID="146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f" Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.951273 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc9056bc-6e8d-410f-a2d1-cf69ec103561" (UID: "fc9056bc-6e8d-410f-a2d1-cf69ec103561"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:07:15 crc kubenswrapper[4834]: I0130 22:07:15.989062 4834 scope.go:117] "RemoveContainer" containerID="3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.017846 4834 scope.go:117] "RemoveContainer" containerID="073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66" Jan 30 22:07:16 crc kubenswrapper[4834]: E0130 22:07:16.018451 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66\": container with ID starting with 073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66 not found: ID does not exist" containerID="073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.018487 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66"} err="failed to get container status \"073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66\": rpc error: code = NotFound desc = could not find container \"073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66\": container with ID starting 
with 073cff9331796971aa95468b78dac94d2b68bd5a52e8b948e62f49046cc5dd66 not found: ID does not exist" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.018509 4834 scope.go:117] "RemoveContainer" containerID="146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f" Jan 30 22:07:16 crc kubenswrapper[4834]: E0130 22:07:16.018846 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f\": container with ID starting with 146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f not found: ID does not exist" containerID="146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.018870 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f"} err="failed to get container status \"146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f\": rpc error: code = NotFound desc = could not find container \"146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f\": container with ID starting with 146df31859fb4c1bcc0c306eb9d07a96dafbfe682f3f4cc44632daf4773df46f not found: ID does not exist" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.018884 4834 scope.go:117] "RemoveContainer" containerID="3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee" Jan 30 22:07:16 crc kubenswrapper[4834]: E0130 22:07:16.019112 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee\": container with ID starting with 3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee not found: ID does not exist" containerID="3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee" Jan 30 22:07:16 
crc kubenswrapper[4834]: I0130 22:07:16.019140 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee"} err="failed to get container status \"3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee\": rpc error: code = NotFound desc = could not find container \"3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee\": container with ID starting with 3b0c6492bdc8dea742bcd72610f9f72c6f2691adb0d0019ad2c66108afc5f5ee not found: ID does not exist" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.031918 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.031962 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9056bc-6e8d-410f-a2d1-cf69ec103561-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.031976 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c476z\" (UniqueName: \"kubernetes.io/projected/fc9056bc-6e8d-410f-a2d1-cf69ec103561-kube-api-access-c476z\") on node \"crc\" DevicePath \"\"" Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.260826 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6tph"] Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.270758 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6tph"] Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.930903 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"acbc4be9-32f5-471a-b881-578a0b7b715f","Type":"ContainerStarted","Data":"fbbd820c6837b0e532a3f0b63fad3e8b385b78b52304a20feb2cbe746ba8bab4"} Jan 30 22:07:16 crc kubenswrapper[4834]: I0130 22:07:16.946186 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.358138419 podStartE2EDuration="54.946166727s" podCreationTimestamp="2026-01-30 22:06:22 +0000 UTC" firstStartedPulling="2026-01-30 22:06:24.826676311 +0000 UTC m=+3035.979822449" lastFinishedPulling="2026-01-30 22:07:15.414704609 +0000 UTC m=+3086.567850757" observedRunningTime="2026-01-30 22:07:16.945249772 +0000 UTC m=+3088.098395910" watchObservedRunningTime="2026-01-30 22:07:16.946166727 +0000 UTC m=+3088.099312865" Jan 30 22:07:17 crc kubenswrapper[4834]: I0130 22:07:17.545344 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" path="/var/lib/kubelet/pods/fc9056bc-6e8d-410f-a2d1-cf69ec103561/volumes" Jan 30 22:07:21 crc kubenswrapper[4834]: I0130 22:07:21.532124 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:07:21 crc kubenswrapper[4834]: E0130 22:07:21.533723 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:07:32 crc kubenswrapper[4834]: I0130 22:07:32.531112 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:07:32 crc kubenswrapper[4834]: E0130 22:07:32.532058 4834 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:07:44 crc kubenswrapper[4834]: I0130 22:07:44.531547 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:07:44 crc kubenswrapper[4834]: E0130 22:07:44.532387 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:07:58 crc kubenswrapper[4834]: I0130 22:07:58.532552 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:07:58 crc kubenswrapper[4834]: E0130 22:07:58.533429 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:08:11 crc kubenswrapper[4834]: I0130 22:08:11.532247 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:08:11 crc kubenswrapper[4834]: E0130 22:08:11.533568 4834 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:08:22 crc kubenswrapper[4834]: I0130 22:08:22.531652 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:08:22 crc kubenswrapper[4834]: E0130 22:08:22.532512 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:08:36 crc kubenswrapper[4834]: I0130 22:08:36.531137 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:08:36 crc kubenswrapper[4834]: E0130 22:08:36.531991 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:08:49 crc kubenswrapper[4834]: I0130 22:08:49.540020 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:08:49 crc kubenswrapper[4834]: E0130 22:08:49.540784 4834 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:09:02 crc kubenswrapper[4834]: I0130 22:09:02.531254 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:09:02 crc kubenswrapper[4834]: E0130 22:09:02.532235 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:09:17 crc kubenswrapper[4834]: I0130 22:09:17.531379 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:09:17 crc kubenswrapper[4834]: E0130 22:09:17.532040 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:09:28 crc kubenswrapper[4834]: I0130 22:09:28.531545 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:09:28 crc kubenswrapper[4834]: E0130 
22:09:28.532489 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.557093 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fzfhp"] Jan 30 22:09:37 crc kubenswrapper[4834]: E0130 22:09:37.558157 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerName="extract-utilities" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.558177 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerName="extract-utilities" Jan 30 22:09:37 crc kubenswrapper[4834]: E0130 22:09:37.558217 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerName="registry-server" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.558225 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerName="registry-server" Jan 30 22:09:37 crc kubenswrapper[4834]: E0130 22:09:37.558247 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerName="extract-content" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.558257 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" containerName="extract-content" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.558515 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9056bc-6e8d-410f-a2d1-cf69ec103561" 
containerName="registry-server" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.560333 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.568905 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fzfhp"] Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.634918 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24d74\" (UniqueName: \"kubernetes.io/projected/355fa518-4d75-4888-be7d-93d00863cb38-kube-api-access-24d74\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.634970 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-utilities\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.635210 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-catalog-content\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.736913 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24d74\" (UniqueName: \"kubernetes.io/projected/355fa518-4d75-4888-be7d-93d00863cb38-kube-api-access-24d74\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") 
" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.736963 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-utilities\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.737462 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-utilities\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.737520 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-catalog-content\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.737773 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-catalog-content\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.761562 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24d74\" (UniqueName: \"kubernetes.io/projected/355fa518-4d75-4888-be7d-93d00863cb38-kube-api-access-24d74\") pod \"community-operators-fzfhp\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " 
pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:37 crc kubenswrapper[4834]: I0130 22:09:37.896205 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:38 crc kubenswrapper[4834]: I0130 22:09:38.475635 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fzfhp"] Jan 30 22:09:39 crc kubenswrapper[4834]: I0130 22:09:39.266259 4834 generic.go:334] "Generic (PLEG): container finished" podID="355fa518-4d75-4888-be7d-93d00863cb38" containerID="717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9" exitCode=0 Jan 30 22:09:39 crc kubenswrapper[4834]: I0130 22:09:39.266318 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzfhp" event={"ID":"355fa518-4d75-4888-be7d-93d00863cb38","Type":"ContainerDied","Data":"717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9"} Jan 30 22:09:39 crc kubenswrapper[4834]: I0130 22:09:39.266574 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzfhp" event={"ID":"355fa518-4d75-4888-be7d-93d00863cb38","Type":"ContainerStarted","Data":"55f86900e69a92e8b8b0162d68074a90d3955b1d7463fd93289a7b28a8769b62"} Jan 30 22:09:41 crc kubenswrapper[4834]: I0130 22:09:41.288652 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzfhp" event={"ID":"355fa518-4d75-4888-be7d-93d00863cb38","Type":"ContainerStarted","Data":"99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af"} Jan 30 22:09:41 crc kubenswrapper[4834]: I0130 22:09:41.531045 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:09:41 crc kubenswrapper[4834]: E0130 22:09:41.531343 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:09:44 crc kubenswrapper[4834]: I0130 22:09:44.317931 4834 generic.go:334] "Generic (PLEG): container finished" podID="355fa518-4d75-4888-be7d-93d00863cb38" containerID="99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af" exitCode=0 Jan 30 22:09:44 crc kubenswrapper[4834]: I0130 22:09:44.318005 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzfhp" event={"ID":"355fa518-4d75-4888-be7d-93d00863cb38","Type":"ContainerDied","Data":"99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af"} Jan 30 22:09:45 crc kubenswrapper[4834]: I0130 22:09:45.330029 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzfhp" event={"ID":"355fa518-4d75-4888-be7d-93d00863cb38","Type":"ContainerStarted","Data":"666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7"} Jan 30 22:09:45 crc kubenswrapper[4834]: I0130 22:09:45.354781 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fzfhp" podStartSLOduration=2.8491669809999998 podStartE2EDuration="8.354764792s" podCreationTimestamp="2026-01-30 22:09:37 +0000 UTC" firstStartedPulling="2026-01-30 22:09:39.26868554 +0000 UTC m=+3230.421831678" lastFinishedPulling="2026-01-30 22:09:44.774283331 +0000 UTC m=+3235.927429489" observedRunningTime="2026-01-30 22:09:45.350735869 +0000 UTC m=+3236.503882007" watchObservedRunningTime="2026-01-30 22:09:45.354764792 +0000 UTC m=+3236.507910930" Jan 30 22:09:47 crc kubenswrapper[4834]: I0130 22:09:47.897152 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:47 crc kubenswrapper[4834]: I0130 22:09:47.897699 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:47 crc kubenswrapper[4834]: I0130 22:09:47.954489 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:56 crc kubenswrapper[4834]: I0130 22:09:56.531127 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:09:56 crc kubenswrapper[4834]: E0130 22:09:56.531989 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:09:57 crc kubenswrapper[4834]: I0130 22:09:57.947082 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:57 crc kubenswrapper[4834]: I0130 22:09:57.999663 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fzfhp"] Jan 30 22:09:58 crc kubenswrapper[4834]: I0130 22:09:58.443350 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fzfhp" podUID="355fa518-4d75-4888-be7d-93d00863cb38" containerName="registry-server" containerID="cri-o://666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7" gracePeriod=2 Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.046366 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.102060 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24d74\" (UniqueName: \"kubernetes.io/projected/355fa518-4d75-4888-be7d-93d00863cb38-kube-api-access-24d74\") pod \"355fa518-4d75-4888-be7d-93d00863cb38\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.102127 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-utilities\") pod \"355fa518-4d75-4888-be7d-93d00863cb38\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.102457 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-catalog-content\") pod \"355fa518-4d75-4888-be7d-93d00863cb38\" (UID: \"355fa518-4d75-4888-be7d-93d00863cb38\") " Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.102935 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-utilities" (OuterVolumeSpecName: "utilities") pod "355fa518-4d75-4888-be7d-93d00863cb38" (UID: "355fa518-4d75-4888-be7d-93d00863cb38"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.103243 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.111557 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355fa518-4d75-4888-be7d-93d00863cb38-kube-api-access-24d74" (OuterVolumeSpecName: "kube-api-access-24d74") pod "355fa518-4d75-4888-be7d-93d00863cb38" (UID: "355fa518-4d75-4888-be7d-93d00863cb38"). InnerVolumeSpecName "kube-api-access-24d74". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.158198 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "355fa518-4d75-4888-be7d-93d00863cb38" (UID: "355fa518-4d75-4888-be7d-93d00863cb38"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.205048 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/355fa518-4d75-4888-be7d-93d00863cb38-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.205100 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24d74\" (UniqueName: \"kubernetes.io/projected/355fa518-4d75-4888-be7d-93d00863cb38-kube-api-access-24d74\") on node \"crc\" DevicePath \"\"" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.453522 4834 generic.go:334] "Generic (PLEG): container finished" podID="355fa518-4d75-4888-be7d-93d00863cb38" containerID="666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7" exitCode=0 Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.453577 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fzfhp" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.453576 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzfhp" event={"ID":"355fa518-4d75-4888-be7d-93d00863cb38","Type":"ContainerDied","Data":"666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7"} Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.453710 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fzfhp" event={"ID":"355fa518-4d75-4888-be7d-93d00863cb38","Type":"ContainerDied","Data":"55f86900e69a92e8b8b0162d68074a90d3955b1d7463fd93289a7b28a8769b62"} Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.453732 4834 scope.go:117] "RemoveContainer" containerID="666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.474110 4834 scope.go:117] "RemoveContainer" 
containerID="99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.490766 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fzfhp"] Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.499956 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fzfhp"] Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.519845 4834 scope.go:117] "RemoveContainer" containerID="717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.547228 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355fa518-4d75-4888-be7d-93d00863cb38" path="/var/lib/kubelet/pods/355fa518-4d75-4888-be7d-93d00863cb38/volumes" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.563185 4834 scope.go:117] "RemoveContainer" containerID="666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7" Jan 30 22:09:59 crc kubenswrapper[4834]: E0130 22:09:59.565062 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7\": container with ID starting with 666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7 not found: ID does not exist" containerID="666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.565117 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7"} err="failed to get container status \"666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7\": rpc error: code = NotFound desc = could not find container \"666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7\": container with ID starting with 
666fc496ce2774dbbea2b7b8e69203ec8c4392f63ca6b8cbe9314be3068280b7 not found: ID does not exist" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.565149 4834 scope.go:117] "RemoveContainer" containerID="99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af" Jan 30 22:09:59 crc kubenswrapper[4834]: E0130 22:09:59.565647 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af\": container with ID starting with 99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af not found: ID does not exist" containerID="99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.565684 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af"} err="failed to get container status \"99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af\": rpc error: code = NotFound desc = could not find container \"99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af\": container with ID starting with 99917360bf7ef1f0f6d5ae42f24e62826d5f9fed1f353cdf1b9e06b2a20600af not found: ID does not exist" Jan 30 22:09:59 crc kubenswrapper[4834]: I0130 22:09:59.565710 4834 scope.go:117] "RemoveContainer" containerID="717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9" Jan 30 22:09:59 crc kubenswrapper[4834]: E0130 22:09:59.566470 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9\": container with ID starting with 717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9 not found: ID does not exist" containerID="717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9" Jan 30 22:09:59 crc 
kubenswrapper[4834]: I0130 22:09:59.566494 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9"} err="failed to get container status \"717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9\": rpc error: code = NotFound desc = could not find container \"717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9\": container with ID starting with 717070f911e73a2d9dd005460d6ffaeaf06d6edc742fff3611452eae65eef1c9 not found: ID does not exist" Jan 30 22:10:07 crc kubenswrapper[4834]: I0130 22:10:07.531203 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:10:08 crc kubenswrapper[4834]: I0130 22:10:08.544760 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"2e3a5472d260ce3d2f1f21d2d78a8682942d804eb5c8811b48c2375a43798b0c"} Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.176661 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8bpq2"] Jan 30 22:11:06 crc kubenswrapper[4834]: E0130 22:11:06.178209 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355fa518-4d75-4888-be7d-93d00863cb38" containerName="registry-server" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.178243 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="355fa518-4d75-4888-be7d-93d00863cb38" containerName="registry-server" Jan 30 22:11:06 crc kubenswrapper[4834]: E0130 22:11:06.178310 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355fa518-4d75-4888-be7d-93d00863cb38" containerName="extract-utilities" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.178329 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="355fa518-4d75-4888-be7d-93d00863cb38" containerName="extract-utilities" Jan 30 22:11:06 crc kubenswrapper[4834]: E0130 22:11:06.178377 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355fa518-4d75-4888-be7d-93d00863cb38" containerName="extract-content" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.178428 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="355fa518-4d75-4888-be7d-93d00863cb38" containerName="extract-content" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.178928 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="355fa518-4d75-4888-be7d-93d00863cb38" containerName="registry-server" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.193431 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.212271 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-utilities\") pod \"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.212449 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plztf\" (UniqueName: \"kubernetes.io/projected/b0614310-f7bb-4a21-89fa-c3437e3a16e5-kube-api-access-plztf\") pod \"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.212717 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-catalog-content\") pod 
\"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.213548 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bpq2"] Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.315685 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plztf\" (UniqueName: \"kubernetes.io/projected/b0614310-f7bb-4a21-89fa-c3437e3a16e5-kube-api-access-plztf\") pod \"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.315832 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-catalog-content\") pod \"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.315901 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-utilities\") pod \"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.316327 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-utilities\") pod \"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.316905 4834 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-catalog-content\") pod \"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.340326 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plztf\" (UniqueName: \"kubernetes.io/projected/b0614310-f7bb-4a21-89fa-c3437e3a16e5-kube-api-access-plztf\") pod \"certified-operators-8bpq2\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.539197 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:06 crc kubenswrapper[4834]: I0130 22:11:06.992102 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8bpq2"] Jan 30 22:11:07 crc kubenswrapper[4834]: I0130 22:11:07.060383 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bpq2" event={"ID":"b0614310-f7bb-4a21-89fa-c3437e3a16e5","Type":"ContainerStarted","Data":"72e2c65635da6b6f15e6170afa7cdb55f3b05e28e02f833d31e853a5e78978f6"} Jan 30 22:11:08 crc kubenswrapper[4834]: I0130 22:11:08.072041 4834 generic.go:334] "Generic (PLEG): container finished" podID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerID="0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66" exitCode=0 Jan 30 22:11:08 crc kubenswrapper[4834]: I0130 22:11:08.072091 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bpq2" event={"ID":"b0614310-f7bb-4a21-89fa-c3437e3a16e5","Type":"ContainerDied","Data":"0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66"} Jan 30 22:11:10 crc kubenswrapper[4834]: I0130 22:11:10.097939 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bpq2" event={"ID":"b0614310-f7bb-4a21-89fa-c3437e3a16e5","Type":"ContainerStarted","Data":"8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3"} Jan 30 22:11:11 crc kubenswrapper[4834]: I0130 22:11:11.108542 4834 generic.go:334] "Generic (PLEG): container finished" podID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerID="8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3" exitCode=0 Jan 30 22:11:11 crc kubenswrapper[4834]: I0130 22:11:11.108614 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bpq2" event={"ID":"b0614310-f7bb-4a21-89fa-c3437e3a16e5","Type":"ContainerDied","Data":"8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3"} Jan 30 22:11:12 crc kubenswrapper[4834]: I0130 22:11:12.119459 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bpq2" event={"ID":"b0614310-f7bb-4a21-89fa-c3437e3a16e5","Type":"ContainerStarted","Data":"dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56"} Jan 30 22:11:12 crc kubenswrapper[4834]: I0130 22:11:12.141976 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8bpq2" podStartSLOduration=2.720794237 podStartE2EDuration="6.141959105s" podCreationTimestamp="2026-01-30 22:11:06 +0000 UTC" firstStartedPulling="2026-01-30 22:11:08.073630064 +0000 UTC m=+3319.226776202" lastFinishedPulling="2026-01-30 22:11:11.494794922 +0000 UTC m=+3322.647941070" observedRunningTime="2026-01-30 22:11:12.135908764 +0000 UTC m=+3323.289054902" watchObservedRunningTime="2026-01-30 22:11:12.141959105 +0000 UTC m=+3323.295105243" Jan 30 22:11:16 crc kubenswrapper[4834]: I0130 22:11:16.539810 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:16 crc 
kubenswrapper[4834]: I0130 22:11:16.540164 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:16 crc kubenswrapper[4834]: I0130 22:11:16.603197 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:17 crc kubenswrapper[4834]: I0130 22:11:17.228642 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:17 crc kubenswrapper[4834]: I0130 22:11:17.272636 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8bpq2"] Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.196982 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8bpq2" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerName="registry-server" containerID="cri-o://dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56" gracePeriod=2 Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.782109 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.879581 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-utilities\") pod \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.879805 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-catalog-content\") pod \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.879940 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plztf\" (UniqueName: \"kubernetes.io/projected/b0614310-f7bb-4a21-89fa-c3437e3a16e5-kube-api-access-plztf\") pod \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\" (UID: \"b0614310-f7bb-4a21-89fa-c3437e3a16e5\") " Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.880722 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-utilities" (OuterVolumeSpecName: "utilities") pod "b0614310-f7bb-4a21-89fa-c3437e3a16e5" (UID: "b0614310-f7bb-4a21-89fa-c3437e3a16e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.890734 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0614310-f7bb-4a21-89fa-c3437e3a16e5-kube-api-access-plztf" (OuterVolumeSpecName: "kube-api-access-plztf") pod "b0614310-f7bb-4a21-89fa-c3437e3a16e5" (UID: "b0614310-f7bb-4a21-89fa-c3437e3a16e5"). InnerVolumeSpecName "kube-api-access-plztf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.982090 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plztf\" (UniqueName: \"kubernetes.io/projected/b0614310-f7bb-4a21-89fa-c3437e3a16e5-kube-api-access-plztf\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:19 crc kubenswrapper[4834]: I0130 22:11:19.982363 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.208267 4834 generic.go:334] "Generic (PLEG): container finished" podID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerID="dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56" exitCode=0 Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.208357 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bpq2" event={"ID":"b0614310-f7bb-4a21-89fa-c3437e3a16e5","Type":"ContainerDied","Data":"dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56"} Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.209124 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8bpq2" event={"ID":"b0614310-f7bb-4a21-89fa-c3437e3a16e5","Type":"ContainerDied","Data":"72e2c65635da6b6f15e6170afa7cdb55f3b05e28e02f833d31e853a5e78978f6"} Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.208473 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8bpq2" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.209157 4834 scope.go:117] "RemoveContainer" containerID="dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.236412 4834 scope.go:117] "RemoveContainer" containerID="8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.256804 4834 scope.go:117] "RemoveContainer" containerID="0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.300789 4834 scope.go:117] "RemoveContainer" containerID="dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.300944 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0614310-f7bb-4a21-89fa-c3437e3a16e5" (UID: "b0614310-f7bb-4a21-89fa-c3437e3a16e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:11:20 crc kubenswrapper[4834]: E0130 22:11:20.301460 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56\": container with ID starting with dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56 not found: ID does not exist" containerID="dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.301583 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56"} err="failed to get container status \"dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56\": rpc error: code = NotFound desc = could not find container \"dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56\": container with ID starting with dcb021b83fb0e573d0f08781023845b65dacd094bc4a695864f77809a4f12c56 not found: ID does not exist" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.301693 4834 scope.go:117] "RemoveContainer" containerID="8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3" Jan 30 22:11:20 crc kubenswrapper[4834]: E0130 22:11:20.302049 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3\": container with ID starting with 8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3 not found: ID does not exist" containerID="8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.302146 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3"} 
err="failed to get container status \"8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3\": rpc error: code = NotFound desc = could not find container \"8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3\": container with ID starting with 8fe4aea5ca29416310b7eb0c32c8b76501a2ea83da4ee467e971cd67287fd0b3 not found: ID does not exist" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.302216 4834 scope.go:117] "RemoveContainer" containerID="0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66" Jan 30 22:11:20 crc kubenswrapper[4834]: E0130 22:11:20.302855 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66\": container with ID starting with 0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66 not found: ID does not exist" containerID="0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.302887 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66"} err="failed to get container status \"0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66\": rpc error: code = NotFound desc = could not find container \"0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66\": container with ID starting with 0a1cf308aeb67645eec5ed98c4932289520f7fb00d116a13e10867474ff49a66 not found: ID does not exist" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.389014 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0614310-f7bb-4a21-89fa-c3437e3a16e5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.542748 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-8bpq2"] Jan 30 22:11:20 crc kubenswrapper[4834]: I0130 22:11:20.553486 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8bpq2"] Jan 30 22:11:21 crc kubenswrapper[4834]: I0130 22:11:21.546305 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" path="/var/lib/kubelet/pods/b0614310-f7bb-4a21-89fa-c3437e3a16e5/volumes" Jan 30 22:11:37 crc kubenswrapper[4834]: I0130 22:11:37.365469 4834 generic.go:334] "Generic (PLEG): container finished" podID="acbc4be9-32f5-471a-b881-578a0b7b715f" containerID="fbbd820c6837b0e532a3f0b63fad3e8b385b78b52304a20feb2cbe746ba8bab4" exitCode=0 Jan 30 22:11:37 crc kubenswrapper[4834]: I0130 22:11:37.365579 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"acbc4be9-32f5-471a-b881-578a0b7b715f","Type":"ContainerDied","Data":"fbbd820c6837b0e532a3f0b63fad3e8b385b78b52304a20feb2cbe746ba8bab4"} Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.798108 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.869830 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.870103 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-temporary\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.870139 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ca-certs\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.870207 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdbxw\" (UniqueName: \"kubernetes.io/projected/acbc4be9-32f5-471a-b881-578a0b7b715f-kube-api-access-tdbxw\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.870229 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ssh-key\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.870296 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-config-data\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.870347 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-workdir\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.870418 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.870456 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config-secret\") pod \"acbc4be9-32f5-471a-b881-578a0b7b715f\" (UID: \"acbc4be9-32f5-471a-b881-578a0b7b715f\") " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.876466 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-config-data" (OuterVolumeSpecName: "config-data") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.877507 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.878099 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbc4be9-32f5-471a-b881-578a0b7b715f-kube-api-access-tdbxw" (OuterVolumeSpecName: "kube-api-access-tdbxw") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "kube-api-access-tdbxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.885232 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.887866 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "test-operator-logs") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.904604 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.911584 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.918505 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.936765 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "acbc4be9-32f5-471a-b881-578a0b7b715f" (UID: "acbc4be9-32f5-471a-b881-578a0b7b715f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973151 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-config-data\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973192 4834 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973234 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973245 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973255 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/acbc4be9-32f5-471a-b881-578a0b7b715f-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973264 4834 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/acbc4be9-32f5-471a-b881-578a0b7b715f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973276 4834 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 30 
22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973290 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdbxw\" (UniqueName: \"kubernetes.io/projected/acbc4be9-32f5-471a-b881-578a0b7b715f-kube-api-access-tdbxw\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:38 crc kubenswrapper[4834]: I0130 22:11:38.973303 4834 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acbc4be9-32f5-471a-b881-578a0b7b715f-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:39 crc kubenswrapper[4834]: I0130 22:11:39.004010 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 30 22:11:39 crc kubenswrapper[4834]: I0130 22:11:39.075753 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 30 22:11:39 crc kubenswrapper[4834]: I0130 22:11:39.394327 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"acbc4be9-32f5-471a-b881-578a0b7b715f","Type":"ContainerDied","Data":"6fb9dd629921ab700fc4a75ebb9ebd38bfa3f99af5735bc077c85e3ddfe4dbff"} Jan 30 22:11:39 crc kubenswrapper[4834]: I0130 22:11:39.394838 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fb9dd629921ab700fc4a75ebb9ebd38bfa3f99af5735bc077c85e3ddfe4dbff" Jan 30 22:11:39 crc kubenswrapper[4834]: I0130 22:11:39.394473 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.864778 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 22:11:48 crc kubenswrapper[4834]: E0130 22:11:48.865703 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerName="extract-content" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.865718 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerName="extract-content" Jan 30 22:11:48 crc kubenswrapper[4834]: E0130 22:11:48.865745 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerName="extract-utilities" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.865754 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerName="extract-utilities" Jan 30 22:11:48 crc kubenswrapper[4834]: E0130 22:11:48.865784 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerName="registry-server" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.865805 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerName="registry-server" Jan 30 22:11:48 crc kubenswrapper[4834]: E0130 22:11:48.865819 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbc4be9-32f5-471a-b881-578a0b7b715f" containerName="tempest-tests-tempest-tests-runner" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.865828 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbc4be9-32f5-471a-b881-578a0b7b715f" containerName="tempest-tests-tempest-tests-runner" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.866003 4834 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="b0614310-f7bb-4a21-89fa-c3437e3a16e5" containerName="registry-server" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.866034 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbc4be9-32f5-471a-b881-578a0b7b715f" containerName="tempest-tests-tempest-tests-runner" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.866670 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.868434 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-q6nhs" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.876030 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.983577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:48 crc kubenswrapper[4834]: I0130 22:11:48.983621 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v87cg\" (UniqueName: \"kubernetes.io/projected/7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5-kube-api-access-v87cg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:49 crc kubenswrapper[4834]: I0130 22:11:49.086068 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:49 crc kubenswrapper[4834]: I0130 22:11:49.086141 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v87cg\" (UniqueName: \"kubernetes.io/projected/7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5-kube-api-access-v87cg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:49 crc kubenswrapper[4834]: I0130 22:11:49.086587 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:49 crc kubenswrapper[4834]: I0130 22:11:49.107821 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v87cg\" (UniqueName: \"kubernetes.io/projected/7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5-kube-api-access-v87cg\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:49 crc kubenswrapper[4834]: I0130 22:11:49.131484 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:49 
crc kubenswrapper[4834]: I0130 22:11:49.193268 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 30 22:11:49 crc kubenswrapper[4834]: I0130 22:11:49.671102 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 30 22:11:50 crc kubenswrapper[4834]: I0130 22:11:50.489095 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5","Type":"ContainerStarted","Data":"ee70dffffa05a1756448180d03f4e97b835e028b64f50dc5e909c17155ca7a46"} Jan 30 22:11:51 crc kubenswrapper[4834]: I0130 22:11:51.497468 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5","Type":"ContainerStarted","Data":"bc9661deece9e4a6d69c4803e5af82b677280f74a61118be960e8c95412e790a"} Jan 30 22:11:51 crc kubenswrapper[4834]: I0130 22:11:51.514226 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.711057347 podStartE2EDuration="3.514207342s" podCreationTimestamp="2026-01-30 22:11:48 +0000 UTC" firstStartedPulling="2026-01-30 22:11:49.688264273 +0000 UTC m=+3360.841410441" lastFinishedPulling="2026-01-30 22:11:50.491414298 +0000 UTC m=+3361.644560436" observedRunningTime="2026-01-30 22:11:51.509354285 +0000 UTC m=+3362.662500423" watchObservedRunningTime="2026-01-30 22:11:51.514207342 +0000 UTC m=+3362.667353480" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.506849 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zrmdk/must-gather-lcnbh"] Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.538787 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.551354 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zrmdk"/"kube-root-ca.crt" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.551948 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zrmdk"/"openshift-service-ca.crt" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.567496 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zrmdk/must-gather-lcnbh"] Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.702993 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcdxr\" (UniqueName: \"kubernetes.io/projected/9efbd46f-63c9-4934-9888-90224238632c-kube-api-access-gcdxr\") pod \"must-gather-lcnbh\" (UID: \"9efbd46f-63c9-4934-9888-90224238632c\") " pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.703276 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9efbd46f-63c9-4934-9888-90224238632c-must-gather-output\") pod \"must-gather-lcnbh\" (UID: \"9efbd46f-63c9-4934-9888-90224238632c\") " pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.805869 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcdxr\" (UniqueName: \"kubernetes.io/projected/9efbd46f-63c9-4934-9888-90224238632c-kube-api-access-gcdxr\") pod \"must-gather-lcnbh\" (UID: \"9efbd46f-63c9-4934-9888-90224238632c\") " pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.805979 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9efbd46f-63c9-4934-9888-90224238632c-must-gather-output\") pod \"must-gather-lcnbh\" (UID: \"9efbd46f-63c9-4934-9888-90224238632c\") " pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.806524 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9efbd46f-63c9-4934-9888-90224238632c-must-gather-output\") pod \"must-gather-lcnbh\" (UID: \"9efbd46f-63c9-4934-9888-90224238632c\") " pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.832468 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcdxr\" (UniqueName: \"kubernetes.io/projected/9efbd46f-63c9-4934-9888-90224238632c-kube-api-access-gcdxr\") pod \"must-gather-lcnbh\" (UID: \"9efbd46f-63c9-4934-9888-90224238632c\") " pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:12:31 crc kubenswrapper[4834]: I0130 22:12:31.867521 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:12:32 crc kubenswrapper[4834]: I0130 22:12:32.264722 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zrmdk/must-gather-lcnbh"] Jan 30 22:12:32 crc kubenswrapper[4834]: I0130 22:12:32.270491 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 22:12:32 crc kubenswrapper[4834]: I0130 22:12:32.918343 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" event={"ID":"9efbd46f-63c9-4934-9888-90224238632c","Type":"ContainerStarted","Data":"3ff8442182a0dcec8ec651e6a2a2726d34a3e6649c99fe2d9b0a3e1a2c2bef4b"} Jan 30 22:12:34 crc kubenswrapper[4834]: I0130 22:12:34.161053 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:12:34 crc kubenswrapper[4834]: I0130 22:12:34.161440 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:12:36 crc kubenswrapper[4834]: I0130 22:12:36.955202 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" event={"ID":"9efbd46f-63c9-4934-9888-90224238632c","Type":"ContainerStarted","Data":"a797184afdb0b9b030db81947817c0c181bc79a00500cfbe6e726a10a161a769"} Jan 30 22:12:36 crc kubenswrapper[4834]: I0130 22:12:36.955778 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" 
event={"ID":"9efbd46f-63c9-4934-9888-90224238632c","Type":"ContainerStarted","Data":"4871983d95be849e47f13bc398b81733014ba355acb24e0ab9642b47e286994e"} Jan 30 22:12:36 crc kubenswrapper[4834]: I0130 22:12:36.970998 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" podStartSLOduration=1.78448861 podStartE2EDuration="5.970981643s" podCreationTimestamp="2026-01-30 22:12:31 +0000 UTC" firstStartedPulling="2026-01-30 22:12:32.270449403 +0000 UTC m=+3403.423595551" lastFinishedPulling="2026-01-30 22:12:36.456942446 +0000 UTC m=+3407.610088584" observedRunningTime="2026-01-30 22:12:36.970635573 +0000 UTC m=+3408.123781711" watchObservedRunningTime="2026-01-30 22:12:36.970981643 +0000 UTC m=+3408.124127781" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.310359 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-kgkrw"] Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.312092 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.314936 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zrmdk"/"default-dockercfg-xgnv9" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.412172 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njhx5\" (UniqueName: \"kubernetes.io/projected/fc73749b-be05-4be5-93d1-250723f685c0-kube-api-access-njhx5\") pod \"crc-debug-kgkrw\" (UID: \"fc73749b-be05-4be5-93d1-250723f685c0\") " pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.412258 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc73749b-be05-4be5-93d1-250723f685c0-host\") pod \"crc-debug-kgkrw\" (UID: \"fc73749b-be05-4be5-93d1-250723f685c0\") " pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.514451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njhx5\" (UniqueName: \"kubernetes.io/projected/fc73749b-be05-4be5-93d1-250723f685c0-kube-api-access-njhx5\") pod \"crc-debug-kgkrw\" (UID: \"fc73749b-be05-4be5-93d1-250723f685c0\") " pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.514860 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc73749b-be05-4be5-93d1-250723f685c0-host\") pod \"crc-debug-kgkrw\" (UID: \"fc73749b-be05-4be5-93d1-250723f685c0\") " pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.514959 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/fc73749b-be05-4be5-93d1-250723f685c0-host\") pod \"crc-debug-kgkrw\" (UID: \"fc73749b-be05-4be5-93d1-250723f685c0\") " pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.536881 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njhx5\" (UniqueName: \"kubernetes.io/projected/fc73749b-be05-4be5-93d1-250723f685c0-kube-api-access-njhx5\") pod \"crc-debug-kgkrw\" (UID: \"fc73749b-be05-4be5-93d1-250723f685c0\") " pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.629985 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:12:40 crc kubenswrapper[4834]: W0130 22:12:40.663972 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc73749b_be05_4be5_93d1_250723f685c0.slice/crio-5ccf4de6afb042198f943f107aeb0f71710b0553f825df1fa98351f743d06c33 WatchSource:0}: Error finding container 5ccf4de6afb042198f943f107aeb0f71710b0553f825df1fa98351f743d06c33: Status 404 returned error can't find the container with id 5ccf4de6afb042198f943f107aeb0f71710b0553f825df1fa98351f743d06c33 Jan 30 22:12:40 crc kubenswrapper[4834]: I0130 22:12:40.996342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" event={"ID":"fc73749b-be05-4be5-93d1-250723f685c0","Type":"ContainerStarted","Data":"5ccf4de6afb042198f943f107aeb0f71710b0553f825df1fa98351f743d06c33"} Jan 30 22:12:52 crc kubenswrapper[4834]: I0130 22:12:52.106620 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" event={"ID":"fc73749b-be05-4be5-93d1-250723f685c0","Type":"ContainerStarted","Data":"3803a6e160ded6dd62de5f96bb7e0de39beb8c032b4298311bf255d241f164ee"} Jan 30 22:12:52 crc kubenswrapper[4834]: I0130 
22:12:52.122627 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" podStartSLOduration=1.304536369 podStartE2EDuration="12.122611301s" podCreationTimestamp="2026-01-30 22:12:40 +0000 UTC" firstStartedPulling="2026-01-30 22:12:40.666547574 +0000 UTC m=+3411.819693722" lastFinishedPulling="2026-01-30 22:12:51.484622516 +0000 UTC m=+3422.637768654" observedRunningTime="2026-01-30 22:12:52.120937824 +0000 UTC m=+3423.274083962" watchObservedRunningTime="2026-01-30 22:12:52.122611301 +0000 UTC m=+3423.275757439" Jan 30 22:13:04 crc kubenswrapper[4834]: I0130 22:13:04.161382 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:13:04 crc kubenswrapper[4834]: I0130 22:13:04.161897 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:13:34 crc kubenswrapper[4834]: I0130 22:13:34.160651 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:13:34 crc kubenswrapper[4834]: I0130 22:13:34.161072 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:13:34 crc kubenswrapper[4834]: I0130 22:13:34.161118 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 22:13:34 crc kubenswrapper[4834]: I0130 22:13:34.161822 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e3a5472d260ce3d2f1f21d2d78a8682942d804eb5c8811b48c2375a43798b0c"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 22:13:34 crc kubenswrapper[4834]: I0130 22:13:34.161863 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://2e3a5472d260ce3d2f1f21d2d78a8682942d804eb5c8811b48c2375a43798b0c" gracePeriod=600 Jan 30 22:13:34 crc kubenswrapper[4834]: I0130 22:13:34.509320 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="2e3a5472d260ce3d2f1f21d2d78a8682942d804eb5c8811b48c2375a43798b0c" exitCode=0 Jan 30 22:13:34 crc kubenswrapper[4834]: I0130 22:13:34.509464 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"2e3a5472d260ce3d2f1f21d2d78a8682942d804eb5c8811b48c2375a43798b0c"} Jan 30 22:13:34 crc kubenswrapper[4834]: I0130 22:13:34.509711 4834 scope.go:117] "RemoveContainer" containerID="f8521cf075e4727826c30de71c168394e455481df2695ecb3019e4bd1943e682" Jan 30 22:13:35 crc kubenswrapper[4834]: I0130 22:13:35.521004 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e"} Jan 30 22:13:37 crc kubenswrapper[4834]: I0130 22:13:37.549013 4834 generic.go:334] "Generic (PLEG): container finished" podID="fc73749b-be05-4be5-93d1-250723f685c0" containerID="3803a6e160ded6dd62de5f96bb7e0de39beb8c032b4298311bf255d241f164ee" exitCode=0 Jan 30 22:13:37 crc kubenswrapper[4834]: I0130 22:13:37.549460 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" event={"ID":"fc73749b-be05-4be5-93d1-250723f685c0","Type":"ContainerDied","Data":"3803a6e160ded6dd62de5f96bb7e0de39beb8c032b4298311bf255d241f164ee"} Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.689298 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.724129 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-kgkrw"] Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.733034 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-kgkrw"] Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.791557 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc73749b-be05-4be5-93d1-250723f685c0-host\") pod \"fc73749b-be05-4be5-93d1-250723f685c0\" (UID: \"fc73749b-be05-4be5-93d1-250723f685c0\") " Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.791718 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc73749b-be05-4be5-93d1-250723f685c0-host" (OuterVolumeSpecName: "host") pod "fc73749b-be05-4be5-93d1-250723f685c0" (UID: 
"fc73749b-be05-4be5-93d1-250723f685c0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.792296 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njhx5\" (UniqueName: \"kubernetes.io/projected/fc73749b-be05-4be5-93d1-250723f685c0-kube-api-access-njhx5\") pod \"fc73749b-be05-4be5-93d1-250723f685c0\" (UID: \"fc73749b-be05-4be5-93d1-250723f685c0\") " Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.792981 4834 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc73749b-be05-4be5-93d1-250723f685c0-host\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.798448 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc73749b-be05-4be5-93d1-250723f685c0-kube-api-access-njhx5" (OuterVolumeSpecName: "kube-api-access-njhx5") pod "fc73749b-be05-4be5-93d1-250723f685c0" (UID: "fc73749b-be05-4be5-93d1-250723f685c0"). InnerVolumeSpecName "kube-api-access-njhx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:13:38 crc kubenswrapper[4834]: I0130 22:13:38.895253 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njhx5\" (UniqueName: \"kubernetes.io/projected/fc73749b-be05-4be5-93d1-250723f685c0-kube-api-access-njhx5\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.544948 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc73749b-be05-4be5-93d1-250723f685c0" path="/var/lib/kubelet/pods/fc73749b-be05-4be5-93d1-250723f685c0/volumes" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.576177 4834 scope.go:117] "RemoveContainer" containerID="3803a6e160ded6dd62de5f96bb7e0de39beb8c032b4298311bf255d241f164ee" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.576317 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-kgkrw" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.670610 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-np7d4"] Jan 30 22:13:39 crc kubenswrapper[4834]: E0130 22:13:39.671294 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc73749b-be05-4be5-93d1-250723f685c0" containerName="container-00" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.671325 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc73749b-be05-4be5-93d1-250723f685c0" containerName="container-00" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.671602 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc73749b-be05-4be5-93d1-250723f685c0" containerName="container-00" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.675188 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.682720 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-np7d4"] Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.825105 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-catalog-content\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.825162 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-utilities\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.825854 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk878\" (UniqueName: \"kubernetes.io/projected/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-kube-api-access-zk878\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.928206 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk878\" (UniqueName: \"kubernetes.io/projected/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-kube-api-access-zk878\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.928308 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-catalog-content\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.928328 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-utilities\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.928867 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-catalog-content\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.928885 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-utilities\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.947158 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-hxfds"] Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.948325 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.950936 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk878\" (UniqueName: \"kubernetes.io/projected/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-kube-api-access-zk878\") pod \"redhat-operators-np7d4\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:39 crc kubenswrapper[4834]: I0130 22:13:39.952916 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zrmdk"/"default-dockercfg-xgnv9" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.026666 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.030049 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7606b43d-995d-4adc-918a-50153b5e37be-host\") pod \"crc-debug-hxfds\" (UID: \"7606b43d-995d-4adc-918a-50153b5e37be\") " pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.030110 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxt5k\" (UniqueName: \"kubernetes.io/projected/7606b43d-995d-4adc-918a-50153b5e37be-kube-api-access-qxt5k\") pod \"crc-debug-hxfds\" (UID: \"7606b43d-995d-4adc-918a-50153b5e37be\") " pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.132060 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxt5k\" (UniqueName: \"kubernetes.io/projected/7606b43d-995d-4adc-918a-50153b5e37be-kube-api-access-qxt5k\") pod \"crc-debug-hxfds\" (UID: \"7606b43d-995d-4adc-918a-50153b5e37be\") " 
pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.132796 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7606b43d-995d-4adc-918a-50153b5e37be-host\") pod \"crc-debug-hxfds\" (UID: \"7606b43d-995d-4adc-918a-50153b5e37be\") " pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.132898 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7606b43d-995d-4adc-918a-50153b5e37be-host\") pod \"crc-debug-hxfds\" (UID: \"7606b43d-995d-4adc-918a-50153b5e37be\") " pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.151502 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxt5k\" (UniqueName: \"kubernetes.io/projected/7606b43d-995d-4adc-918a-50153b5e37be-kube-api-access-qxt5k\") pod \"crc-debug-hxfds\" (UID: \"7606b43d-995d-4adc-918a-50153b5e37be\") " pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.320362 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.593527 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-np7d4"] Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.594630 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/crc-debug-hxfds" event={"ID":"7606b43d-995d-4adc-918a-50153b5e37be","Type":"ContainerStarted","Data":"b53cabaec69c300c3f666731473e05fdcc446c271b513e8b6bd134ca90b4f954"} Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.594675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/crc-debug-hxfds" event={"ID":"7606b43d-995d-4adc-918a-50153b5e37be","Type":"ContainerStarted","Data":"5017f222e98b6c517316c5215e40c65965564a4e8b6ff070f1777bea0c5ec9cb"} Jan 30 22:13:40 crc kubenswrapper[4834]: W0130 22:13:40.602601 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b214c77_6fa3_4922_ba0f_c5e21b2bc0ae.slice/crio-d4ec17fd1aabacf46b23c9fa372287b4d69ad6ffe04d891901a16587e17eebea WatchSource:0}: Error finding container d4ec17fd1aabacf46b23c9fa372287b4d69ad6ffe04d891901a16587e17eebea: Status 404 returned error can't find the container with id d4ec17fd1aabacf46b23c9fa372287b4d69ad6ffe04d891901a16587e17eebea Jan 30 22:13:40 crc kubenswrapper[4834]: I0130 22:13:40.623454 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zrmdk/crc-debug-hxfds" podStartSLOduration=1.6234323000000002 podStartE2EDuration="1.6234323s" podCreationTimestamp="2026-01-30 22:13:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 22:13:40.617228305 +0000 UTC m=+3471.770374443" watchObservedRunningTime="2026-01-30 22:13:40.6234323 +0000 UTC 
m=+3471.776578438" Jan 30 22:13:41 crc kubenswrapper[4834]: I0130 22:13:41.605631 4834 generic.go:334] "Generic (PLEG): container finished" podID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerID="59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e" exitCode=0 Jan 30 22:13:41 crc kubenswrapper[4834]: I0130 22:13:41.605698 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-np7d4" event={"ID":"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae","Type":"ContainerDied","Data":"59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e"} Jan 30 22:13:41 crc kubenswrapper[4834]: I0130 22:13:41.605987 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-np7d4" event={"ID":"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae","Type":"ContainerStarted","Data":"d4ec17fd1aabacf46b23c9fa372287b4d69ad6ffe04d891901a16587e17eebea"} Jan 30 22:13:41 crc kubenswrapper[4834]: I0130 22:13:41.609200 4834 generic.go:334] "Generic (PLEG): container finished" podID="7606b43d-995d-4adc-918a-50153b5e37be" containerID="b53cabaec69c300c3f666731473e05fdcc446c271b513e8b6bd134ca90b4f954" exitCode=0 Jan 30 22:13:41 crc kubenswrapper[4834]: I0130 22:13:41.609242 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/crc-debug-hxfds" event={"ID":"7606b43d-995d-4adc-918a-50153b5e37be","Type":"ContainerDied","Data":"b53cabaec69c300c3f666731473e05fdcc446c271b513e8b6bd134ca90b4f954"} Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.630374 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-np7d4" event={"ID":"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae","Type":"ContainerStarted","Data":"2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf"} Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.779264 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.813061 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-hxfds"] Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.821675 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-hxfds"] Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.896013 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxt5k\" (UniqueName: \"kubernetes.io/projected/7606b43d-995d-4adc-918a-50153b5e37be-kube-api-access-qxt5k\") pod \"7606b43d-995d-4adc-918a-50153b5e37be\" (UID: \"7606b43d-995d-4adc-918a-50153b5e37be\") " Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.896138 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7606b43d-995d-4adc-918a-50153b5e37be-host\") pod \"7606b43d-995d-4adc-918a-50153b5e37be\" (UID: \"7606b43d-995d-4adc-918a-50153b5e37be\") " Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.896308 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7606b43d-995d-4adc-918a-50153b5e37be-host" (OuterVolumeSpecName: "host") pod "7606b43d-995d-4adc-918a-50153b5e37be" (UID: "7606b43d-995d-4adc-918a-50153b5e37be"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.896675 4834 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7606b43d-995d-4adc-918a-50153b5e37be-host\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.901707 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7606b43d-995d-4adc-918a-50153b5e37be-kube-api-access-qxt5k" (OuterVolumeSpecName: "kube-api-access-qxt5k") pod "7606b43d-995d-4adc-918a-50153b5e37be" (UID: "7606b43d-995d-4adc-918a-50153b5e37be"). InnerVolumeSpecName "kube-api-access-qxt5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:13:42 crc kubenswrapper[4834]: I0130 22:13:42.998284 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxt5k\" (UniqueName: \"kubernetes.io/projected/7606b43d-995d-4adc-918a-50153b5e37be-kube-api-access-qxt5k\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.547686 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7606b43d-995d-4adc-918a-50153b5e37be" path="/var/lib/kubelet/pods/7606b43d-995d-4adc-918a-50153b5e37be/volumes" Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.641226 4834 generic.go:334] "Generic (PLEG): container finished" podID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerID="2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf" exitCode=0 Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.642446 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-np7d4" event={"ID":"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae","Type":"ContainerDied","Data":"2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf"} Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.645882 4834 scope.go:117] "RemoveContainer" 
containerID="b53cabaec69c300c3f666731473e05fdcc446c271b513e8b6bd134ca90b4f954" Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.645926 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-hxfds" Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.992310 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-g6hr5"] Jan 30 22:13:43 crc kubenswrapper[4834]: E0130 22:13:43.992822 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7606b43d-995d-4adc-918a-50153b5e37be" containerName="container-00" Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.992845 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7606b43d-995d-4adc-918a-50153b5e37be" containerName="container-00" Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.993081 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7606b43d-995d-4adc-918a-50153b5e37be" containerName="container-00" Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.993881 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:43 crc kubenswrapper[4834]: I0130 22:13:43.998979 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zrmdk"/"default-dockercfg-xgnv9" Jan 30 22:13:44 crc kubenswrapper[4834]: I0130 22:13:44.122954 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a05f8a-620d-4833-8591-1179e8bd682f-host\") pod \"crc-debug-g6hr5\" (UID: \"99a05f8a-620d-4833-8591-1179e8bd682f\") " pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:44 crc kubenswrapper[4834]: I0130 22:13:44.123239 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j9gw\" (UniqueName: \"kubernetes.io/projected/99a05f8a-620d-4833-8591-1179e8bd682f-kube-api-access-6j9gw\") pod \"crc-debug-g6hr5\" (UID: \"99a05f8a-620d-4833-8591-1179e8bd682f\") " pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:44 crc kubenswrapper[4834]: I0130 22:13:44.225757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a05f8a-620d-4833-8591-1179e8bd682f-host\") pod \"crc-debug-g6hr5\" (UID: \"99a05f8a-620d-4833-8591-1179e8bd682f\") " pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:44 crc kubenswrapper[4834]: I0130 22:13:44.225818 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j9gw\" (UniqueName: \"kubernetes.io/projected/99a05f8a-620d-4833-8591-1179e8bd682f-kube-api-access-6j9gw\") pod \"crc-debug-g6hr5\" (UID: \"99a05f8a-620d-4833-8591-1179e8bd682f\") " pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:44 crc kubenswrapper[4834]: I0130 22:13:44.225888 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/99a05f8a-620d-4833-8591-1179e8bd682f-host\") pod \"crc-debug-g6hr5\" (UID: \"99a05f8a-620d-4833-8591-1179e8bd682f\") " pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:44 crc kubenswrapper[4834]: I0130 22:13:44.251253 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j9gw\" (UniqueName: \"kubernetes.io/projected/99a05f8a-620d-4833-8591-1179e8bd682f-kube-api-access-6j9gw\") pod \"crc-debug-g6hr5\" (UID: \"99a05f8a-620d-4833-8591-1179e8bd682f\") " pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:44 crc kubenswrapper[4834]: I0130 22:13:44.320933 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:44 crc kubenswrapper[4834]: W0130 22:13:44.377032 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99a05f8a_620d_4833_8591_1179e8bd682f.slice/crio-669ba155acaf2eb6763276280d498aab9ec7646ff78933bd378f580f46110f2e WatchSource:0}: Error finding container 669ba155acaf2eb6763276280d498aab9ec7646ff78933bd378f580f46110f2e: Status 404 returned error can't find the container with id 669ba155acaf2eb6763276280d498aab9ec7646ff78933bd378f580f46110f2e Jan 30 22:13:44 crc kubenswrapper[4834]: I0130 22:13:44.659090 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" event={"ID":"99a05f8a-620d-4833-8591-1179e8bd682f","Type":"ContainerStarted","Data":"669ba155acaf2eb6763276280d498aab9ec7646ff78933bd378f580f46110f2e"} Jan 30 22:13:45 crc kubenswrapper[4834]: I0130 22:13:45.669378 4834 generic.go:334] "Generic (PLEG): container finished" podID="99a05f8a-620d-4833-8591-1179e8bd682f" containerID="f61d277417e87b388868c91c36ca0378f728bd273f6d9ee7297bc2431becfc76" exitCode=0 Jan 30 22:13:45 crc kubenswrapper[4834]: I0130 22:13:45.669445 4834 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" event={"ID":"99a05f8a-620d-4833-8591-1179e8bd682f","Type":"ContainerDied","Data":"f61d277417e87b388868c91c36ca0378f728bd273f6d9ee7297bc2431becfc76"} Jan 30 22:13:45 crc kubenswrapper[4834]: I0130 22:13:45.672129 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-np7d4" event={"ID":"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae","Type":"ContainerStarted","Data":"23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6"} Jan 30 22:13:45 crc kubenswrapper[4834]: I0130 22:13:45.715972 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-g6hr5"] Jan 30 22:13:45 crc kubenswrapper[4834]: I0130 22:13:45.726362 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zrmdk/crc-debug-g6hr5"] Jan 30 22:13:45 crc kubenswrapper[4834]: I0130 22:13:45.727718 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-np7d4" podStartSLOduration=3.2097407159999998 podStartE2EDuration="6.727697902s" podCreationTimestamp="2026-01-30 22:13:39 +0000 UTC" firstStartedPulling="2026-01-30 22:13:41.608841488 +0000 UTC m=+3472.761987626" lastFinishedPulling="2026-01-30 22:13:45.126798654 +0000 UTC m=+3476.279944812" observedRunningTime="2026-01-30 22:13:45.712808062 +0000 UTC m=+3476.865954210" watchObservedRunningTime="2026-01-30 22:13:45.727697902 +0000 UTC m=+3476.880844040" Jan 30 22:13:46 crc kubenswrapper[4834]: I0130 22:13:46.797541 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:46 crc kubenswrapper[4834]: I0130 22:13:46.876145 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a05f8a-620d-4833-8591-1179e8bd682f-host\") pod \"99a05f8a-620d-4833-8591-1179e8bd682f\" (UID: \"99a05f8a-620d-4833-8591-1179e8bd682f\") " Jan 30 22:13:46 crc kubenswrapper[4834]: I0130 22:13:46.876238 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j9gw\" (UniqueName: \"kubernetes.io/projected/99a05f8a-620d-4833-8591-1179e8bd682f-kube-api-access-6j9gw\") pod \"99a05f8a-620d-4833-8591-1179e8bd682f\" (UID: \"99a05f8a-620d-4833-8591-1179e8bd682f\") " Jan 30 22:13:46 crc kubenswrapper[4834]: I0130 22:13:46.877585 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99a05f8a-620d-4833-8591-1179e8bd682f-host" (OuterVolumeSpecName: "host") pod "99a05f8a-620d-4833-8591-1179e8bd682f" (UID: "99a05f8a-620d-4833-8591-1179e8bd682f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 30 22:13:46 crc kubenswrapper[4834]: I0130 22:13:46.883243 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a05f8a-620d-4833-8591-1179e8bd682f-kube-api-access-6j9gw" (OuterVolumeSpecName: "kube-api-access-6j9gw") pod "99a05f8a-620d-4833-8591-1179e8bd682f" (UID: "99a05f8a-620d-4833-8591-1179e8bd682f"). InnerVolumeSpecName "kube-api-access-6j9gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:13:46 crc kubenswrapper[4834]: I0130 22:13:46.978433 4834 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a05f8a-620d-4833-8591-1179e8bd682f-host\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:46 crc kubenswrapper[4834]: I0130 22:13:46.978730 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j9gw\" (UniqueName: \"kubernetes.io/projected/99a05f8a-620d-4833-8591-1179e8bd682f-kube-api-access-6j9gw\") on node \"crc\" DevicePath \"\"" Jan 30 22:13:47 crc kubenswrapper[4834]: I0130 22:13:47.549883 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a05f8a-620d-4833-8591-1179e8bd682f" path="/var/lib/kubelet/pods/99a05f8a-620d-4833-8591-1179e8bd682f/volumes" Jan 30 22:13:47 crc kubenswrapper[4834]: I0130 22:13:47.690774 4834 scope.go:117] "RemoveContainer" containerID="f61d277417e87b388868c91c36ca0378f728bd273f6d9ee7297bc2431becfc76" Jan 30 22:13:47 crc kubenswrapper[4834]: I0130 22:13:47.691122 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/crc-debug-g6hr5" Jan 30 22:13:50 crc kubenswrapper[4834]: I0130 22:13:50.027810 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:50 crc kubenswrapper[4834]: I0130 22:13:50.028410 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:13:51 crc kubenswrapper[4834]: I0130 22:13:51.075131 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-np7d4" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="registry-server" probeResult="failure" output=< Jan 30 22:13:51 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Jan 30 22:13:51 crc kubenswrapper[4834]: > Jan 30 22:14:00 crc kubenswrapper[4834]: I0130 22:14:00.089073 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:14:00 crc kubenswrapper[4834]: I0130 22:14:00.148509 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:14:00 crc kubenswrapper[4834]: I0130 22:14:00.324449 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-np7d4"] Jan 30 22:14:01 crc kubenswrapper[4834]: I0130 22:14:01.607230 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ff95bcc58-fzksr_de560559-c091-48f7-a2d2-f9f0fffcec65/barbican-api/0.log" Jan 30 22:14:01 crc kubenswrapper[4834]: I0130 22:14:01.662281 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7ff95bcc58-fzksr_de560559-c091-48f7-a2d2-f9f0fffcec65/barbican-api-log/0.log" Jan 30 22:14:01 crc kubenswrapper[4834]: I0130 22:14:01.817857 4834 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-np7d4" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="registry-server" containerID="cri-o://23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6" gracePeriod=2 Jan 30 22:14:01 crc kubenswrapper[4834]: I0130 22:14:01.840343 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6dd649c8b-p8wq8_a7d82a62-7407-4012-9f92-8b0abe1afa08/barbican-keystone-listener/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.003878 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6dd649c8b-p8wq8_a7d82a62-7407-4012-9f92-8b0abe1afa08/barbican-keystone-listener-log/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.068021 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-fffb48c8c-724zs_e44120e0-1332-4341-8638-917c3f2f1760/barbican-worker-log/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.172140 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-fffb48c8c-724zs_e44120e0-1332-4341-8638-917c3f2f1760/barbican-worker/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.287760 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-v6fkk_8d7c7096-9310-450a-8562-4aa5ee7c3b4d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.321174 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.430608 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk878\" (UniqueName: \"kubernetes.io/projected/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-kube-api-access-zk878\") pod \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.430985 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-catalog-content\") pod \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.431068 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-utilities\") pod \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\" (UID: \"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae\") " Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.432433 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-utilities" (OuterVolumeSpecName: "utilities") pod "6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" (UID: "6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.436591 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-kube-api-access-zk878" (OuterVolumeSpecName: "kube-api-access-zk878") pod "6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" (UID: "6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae"). InnerVolumeSpecName "kube-api-access-zk878". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.443887 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cff0c742-7a39-4a5d-91b8-f4b6304b19ef/ceilometer-central-agent/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.472038 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cff0c742-7a39-4a5d-91b8-f4b6304b19ef/ceilometer-notification-agent/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.532821 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.532855 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk878\" (UniqueName: \"kubernetes.io/projected/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-kube-api-access-zk878\") on node \"crc\" DevicePath \"\"" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.546965 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cff0c742-7a39-4a5d-91b8-f4b6304b19ef/proxy-httpd/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.592024 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" (UID: "6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.601673 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cff0c742-7a39-4a5d-91b8-f4b6304b19ef/sg-core/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.634597 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.752695 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d2c94b1f-de81-4b06-ba59-e0466d8cd5c7/cinder-api-log/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.788189 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d2c94b1f-de81-4b06-ba59-e0466d8cd5c7/cinder-api/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.827843 4834 generic.go:334] "Generic (PLEG): container finished" podID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerID="23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6" exitCode=0 Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.827893 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-np7d4" event={"ID":"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae","Type":"ContainerDied","Data":"23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6"} Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.827926 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-np7d4" event={"ID":"6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae","Type":"ContainerDied","Data":"d4ec17fd1aabacf46b23c9fa372287b4d69ad6ffe04d891901a16587e17eebea"} Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.827949 4834 scope.go:117] "RemoveContainer" 
containerID="23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.828098 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-np7d4" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.850422 4834 scope.go:117] "RemoveContainer" containerID="2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.880110 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-np7d4"] Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.890879 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-np7d4"] Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.898624 4834 scope.go:117] "RemoveContainer" containerID="59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.926133 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8471f1a0-28a4-4c47-a5a0-77e84eb88e30/cinder-scheduler/0.log" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.958187 4834 scope.go:117] "RemoveContainer" containerID="23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6" Jan 30 22:14:02 crc kubenswrapper[4834]: E0130 22:14:02.959072 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6\": container with ID starting with 23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6 not found: ID does not exist" containerID="23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.959116 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6"} err="failed to get container status \"23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6\": rpc error: code = NotFound desc = could not find container \"23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6\": container with ID starting with 23d6deed3790ecb97c9ab20fa77f15be2853b98f5175358af9aabb8fce5c52b6 not found: ID does not exist" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.959140 4834 scope.go:117] "RemoveContainer" containerID="2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf" Jan 30 22:14:02 crc kubenswrapper[4834]: E0130 22:14:02.959543 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf\": container with ID starting with 2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf not found: ID does not exist" containerID="2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.959568 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf"} err="failed to get container status \"2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf\": rpc error: code = NotFound desc = could not find container \"2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf\": container with ID starting with 2f7a1097af6e984694fed33b07d5ad269d8f1fe12176f44bd7255c65f312aeaf not found: ID does not exist" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.959584 4834 scope.go:117] "RemoveContainer" containerID="59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e" Jan 30 22:14:02 crc kubenswrapper[4834]: E0130 22:14:02.959929 4834 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e\": container with ID starting with 59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e not found: ID does not exist" containerID="59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e" Jan 30 22:14:02 crc kubenswrapper[4834]: I0130 22:14:02.959950 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e"} err="failed to get container status \"59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e\": rpc error: code = NotFound desc = could not find container \"59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e\": container with ID starting with 59879d8ad53dd04b0315e6b4d4d809e6341584de334592ce22d671b68c1b9b6e not found: ID does not exist" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.059464 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8471f1a0-28a4-4c47-a5a0-77e84eb88e30/probe/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.100266 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-px4dt_2ea3b15b-f04b-4690-bb26-e2ec96781265/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.313206 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mfc45_b0d0b92f-1774-4f57-8d8d-df4a1228e2d5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.338685 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-sm2xp_626fd761-10dd-4cb5-9dbb-624ea3e5c525/init/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 
22:14:03.521655 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-sm2xp_626fd761-10dd-4cb5-9dbb-624ea3e5c525/init/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.540918 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" path="/var/lib/kubelet/pods/6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae/volumes" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.577761 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-lqrlj_31eef1e3-8bc9-4b01-953b-ff70f5082420/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.604643 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-sm2xp_626fd761-10dd-4cb5-9dbb-624ea3e5c525/dnsmasq-dns/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.782991 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_468b9ac4-33a0-4138-8c6e-a83db4ff688d/glance-httpd/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.802498 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_468b9ac4-33a0-4138-8c6e-a83db4ff688d/glance-log/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.913572 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_69203bf2-de86-4d46-873d-1061b074c7c8/glance-httpd/0.log" Jan 30 22:14:03 crc kubenswrapper[4834]: I0130 22:14:03.956798 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_69203bf2-de86-4d46-873d-1061b074c7c8/glance-log/0.log" Jan 30 22:14:04 crc kubenswrapper[4834]: I0130 22:14:04.016979 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-wx7bv_ee0fe911-9c60-4565-9f57-b3d8efcd1aa3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:04 crc kubenswrapper[4834]: I0130 22:14:04.198074 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-jx7fh_e7629752-891c-46ea-b8ca-4156789f68b3/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:04 crc kubenswrapper[4834]: I0130 22:14:04.446278 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29496841-n62kw_6d5e7575-f4f8-446e-bc52-08c48fe1806c/keystone-cron/0.log" Jan 30 22:14:04 crc kubenswrapper[4834]: I0130 22:14:04.450326 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5dffc79858-5v9vm_c032c534-05a3-42f0-9d76-8b4c1b317a91/keystone-api/0.log" Jan 30 22:14:04 crc kubenswrapper[4834]: I0130 22:14:04.611764 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9fbc7d25-3b57-4ad8-af31-35cd316da312/kube-state-metrics/0.log" Jan 30 22:14:04 crc kubenswrapper[4834]: I0130 22:14:04.692279 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-jp7sl_0ca23ae2-7ce2-414a-8d68-41008397be4a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:04 crc kubenswrapper[4834]: I0130 22:14:04.815202 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-mxjcd_b6e0ba4f-eb3b-4be0-b266-1a274e7e23b5/logging-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:05 crc kubenswrapper[4834]: I0130 22:14:05.196234 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-f4bb75569-jlmhp_9f76b7e7-c7af-4251-b083-406a76dd6a7f/neutron-httpd/0.log" Jan 30 22:14:05 crc kubenswrapper[4834]: I0130 22:14:05.261616 4834 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_neutron-f4bb75569-jlmhp_9f76b7e7-c7af-4251-b083-406a76dd6a7f/neutron-api/0.log" Jan 30 22:14:05 crc kubenswrapper[4834]: I0130 22:14:05.489221 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-c7xw6_83c863d4-f624-4968-92ff-c6e8bd697115/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:05 crc kubenswrapper[4834]: I0130 22:14:05.906972 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_f58a830a-4400-44c9-be55-758c32d90ac4/nova-cell0-conductor-conductor/0.log" Jan 30 22:14:05 crc kubenswrapper[4834]: I0130 22:14:05.948104 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0/nova-api-log/0.log" Jan 30 22:14:05 crc kubenswrapper[4834]: I0130 22:14:05.968651 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f92f75b4-8d07-4f85-9d0d-2c3d5e4497b0/nova-api-api/0.log" Jan 30 22:14:06 crc kubenswrapper[4834]: I0130 22:14:06.266860 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_255a980b-28cb-4fe1-a9b7-3b504df162a5/nova-cell1-novncproxy-novncproxy/0.log" Jan 30 22:14:06 crc kubenswrapper[4834]: I0130 22:14:06.274582 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_026bbd14-5947-40ec-9c29-1c3153d1cfc2/nova-cell1-conductor-conductor/0.log" Jan 30 22:14:06 crc kubenswrapper[4834]: I0130 22:14:06.485843 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-2xqvc_5f06d94f-e9f1-42b2-9978-b7a128728a60/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:06 crc kubenswrapper[4834]: I0130 22:14:06.608351 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_cafa3f48-a62c-47fe-a1a2-d5bc73c1d944/nova-metadata-log/0.log" Jan 30 22:14:06 crc kubenswrapper[4834]: I0130 22:14:06.923012 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b8c57ee1-252e-4288-9af3-18e8bc12d143/nova-scheduler-scheduler/0.log" Jan 30 22:14:06 crc kubenswrapper[4834]: I0130 22:14:06.936996 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_067a1cfb-a1ba-43f7-8669-b233b41cdbd7/mysql-bootstrap/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.125785 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_067a1cfb-a1ba-43f7-8669-b233b41cdbd7/galera/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.166215 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_067a1cfb-a1ba-43f7-8669-b233b41cdbd7/mysql-bootstrap/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.331242 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f917227d-3bb7-4402-9c62-a1ccd41b9782/mysql-bootstrap/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.523673 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f917227d-3bb7-4402-9c62-a1ccd41b9782/mysql-bootstrap/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.543855 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f917227d-3bb7-4402-9c62-a1ccd41b9782/galera/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.634198 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cafa3f48-a62c-47fe-a1a2-d5bc73c1d944/nova-metadata-metadata/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.697149 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_bda10687-cb12-404d-a99f-366f499918ec/openstackclient/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.752566 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-85gcz_73b7f87d-7f27-4150-9542-ccf5985fd8c6/openstack-network-exporter/0.log" Jan 30 22:14:07 crc kubenswrapper[4834]: I0130 22:14:07.929803 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qxr8t_1ddc19c3-1c4a-43e5-9d87-565575ba3ac1/ovsdb-server-init/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.172457 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qxr8t_1ddc19c3-1c4a-43e5-9d87-565575ba3ac1/ovsdb-server/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.219317 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qxr8t_1ddc19c3-1c4a-43e5-9d87-565575ba3ac1/ovs-vswitchd/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.231765 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qxr8t_1ddc19c3-1c4a-43e5-9d87-565575ba3ac1/ovsdb-server-init/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.420594 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-t2k4l_493ce910-9c99-49f5-85eb-3917715c87b6/ovn-controller/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.556772 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hljjc_112c7019-6e0b-4366-b959-e11750f43a26/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.673217 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_795d7189-87c6-410a-bd1e-aecf055a1719/openstack-network-exporter/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.700133 
4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_795d7189-87c6-410a-bd1e-aecf055a1719/ovn-northd/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.811719 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63838846-58d0-40d4-a246-7f7cc5012673/openstack-network-exporter/0.log" Jan 30 22:14:08 crc kubenswrapper[4834]: I0130 22:14:08.915908 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_63838846-58d0-40d4-a246-7f7cc5012673/ovsdbserver-nb/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.020734 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0cc2729f-00c7-4137-b6a1-9dfd0e01c60a/openstack-network-exporter/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.133856 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0cc2729f-00c7-4137-b6a1-9dfd0e01c60a/ovsdbserver-sb/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.236177 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65cd556484-hkshq_316f606b-e690-43aa-afb0-d8643180d92a/placement-api/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.385897 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65cd556484-hkshq_316f606b-e690-43aa-afb0-d8643180d92a/placement-log/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.426614 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_927e5578-7c09-4caf-ab81-0e8229f8aef0/setup-container/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.663527 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_927e5578-7c09-4caf-ab81-0e8229f8aef0/setup-container/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.674289 4834 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e9b522ed-a619-4a0c-99dd-9f14c679b469/setup-container/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.693414 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_927e5578-7c09-4caf-ab81-0e8229f8aef0/rabbitmq/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.883732 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e9b522ed-a619-4a0c-99dd-9f14c679b469/setup-container/0.log" Jan 30 22:14:09 crc kubenswrapper[4834]: I0130 22:14:09.909986 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e9b522ed-a619-4a0c-99dd-9f14c679b469/rabbitmq/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.009822 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-j9j2s_a0352665-2322-46ae-b019-d20ccc580414/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.138295 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-xpqzs_e9533467-64e4-405d-9086-94b32f633d20/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.253484 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ghgch_a461fd28-a3f9-469a-b3b9-81abf54bf0c6/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.505562 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-42nbl_1309ebf1-3a18-4898-8137-e3658586a506/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.509127 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-zwv5h_dda27b9c-c83f-4ba9-a4e3-59ea0e3b336d/ssh-known-hosts-edpm-deployment/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.668222 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_af27dc41-8d8b-4471-8481-ca32766a9344/memcached/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.746485 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5bb7f7bbcf-bjdrn_c8a480b4-e084-4ea8-b438-6a3c217b4514/proxy-server/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.821300 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5bb7f7bbcf-bjdrn_c8a480b4-e084-4ea8-b438-6a3c217b4514/proxy-httpd/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.879260 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-bv5d8_9f691cdf-c7e6-4083-a7ef-b0a9e1c10feb/swift-ring-rebalance/0.log" Jan 30 22:14:10 crc kubenswrapper[4834]: I0130 22:14:10.981771 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/account-auditor/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.044541 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/account-replicator/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.053708 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/account-reaper/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.122748 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/container-auditor/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.146458 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/account-server/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.226191 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/container-replicator/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.238825 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/container-updater/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.286187 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/container-server/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.343487 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/object-expirer/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.395243 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/object-auditor/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.418289 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/object-replicator/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.455559 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/object-server/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.476504 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/object-updater/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.609749 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/swift-recon-cron/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.622023 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_070baa9f-0897-4fe2-bc14-68a831d81dce/rsync/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.746491 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-8tzzj_dda5e26c-7c67-4aa6-9dea-dde313cbb7d7/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.802973 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_acbc4be9-32f5-471a-b881-578a0b7b715f/tempest-tests-tempest-tests-runner/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.848948 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_7dd5ecf4-90f5-4455-9ea2-8afd54c8d9f5/test-operator-logs-container/0.log" Jan 30 22:14:11 crc kubenswrapper[4834]: I0130 22:14:11.950493 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7qblf_51e87d5c-a737-4250-b885-3d7ebbd2c803/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 30 22:14:33 crc kubenswrapper[4834]: I0130 22:14:33.461149 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-95x2s_2156cb3c-172b-4268-86e6-64b1d40b87ed/manager/0.log" Jan 30 22:14:33 crc kubenswrapper[4834]: I0130 22:14:33.595803 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5_fbe6ba8c-1c28-4778-ba40-bb22671864ed/util/0.log" Jan 30 22:14:33 crc kubenswrapper[4834]: I0130 22:14:33.742752 4834 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5_fbe6ba8c-1c28-4778-ba40-bb22671864ed/pull/0.log" Jan 30 22:14:33 crc kubenswrapper[4834]: I0130 22:14:33.755710 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5_fbe6ba8c-1c28-4778-ba40-bb22671864ed/util/0.log" Jan 30 22:14:33 crc kubenswrapper[4834]: I0130 22:14:33.778813 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5_fbe6ba8c-1c28-4778-ba40-bb22671864ed/pull/0.log" Jan 30 22:14:33 crc kubenswrapper[4834]: I0130 22:14:33.940696 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5_fbe6ba8c-1c28-4778-ba40-bb22671864ed/util/0.log" Jan 30 22:14:33 crc kubenswrapper[4834]: I0130 22:14:33.955777 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5_fbe6ba8c-1c28-4778-ba40-bb22671864ed/extract/0.log" Jan 30 22:14:33 crc kubenswrapper[4834]: I0130 22:14:33.965913 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c2193e9bc444b90e210f090c3afd8986af29f27f24ea0af818bf2b2bfcvbzr5_fbe6ba8c-1c28-4778-ba40-bb22671864ed/pull/0.log" Jan 30 22:14:34 crc kubenswrapper[4834]: I0130 22:14:34.137707 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-hjdtw_2e538c79-bca6-46f0-a63d-fb537639f206/manager/0.log" Jan 30 22:14:34 crc kubenswrapper[4834]: I0130 22:14:34.159153 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-4crlf_6371c2f9-d19b-4b87-b0db-ba05d48ea5fb/manager/0.log" Jan 30 22:14:34 crc 
kubenswrapper[4834]: I0130 22:14:34.368315 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-x8rnk_9c50c180-8d91-43d0-bb07-4ea3881a1751/manager/0.log" Jan 30 22:14:34 crc kubenswrapper[4834]: I0130 22:14:34.388454 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-l2pv5_88accce3-ac33-420c-aa10-b7fea0b498c3/manager/0.log" Jan 30 22:14:34 crc kubenswrapper[4834]: I0130 22:14:34.629302 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-pkzb7_afece73a-c2b2-4905-819f-e8c73d968968/manager/0.log" Jan 30 22:14:34 crc kubenswrapper[4834]: I0130 22:14:34.787429 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-xqpmp_cc123e31-1ac7-4e8e-a5d8-b9671d0cfe73/manager/0.log" Jan 30 22:14:34 crc kubenswrapper[4834]: I0130 22:14:34.796724 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-6rdsk_b0b534e5-be84-4fd0-a8f6-ee233988095e/manager/0.log" Jan 30 22:14:34 crc kubenswrapper[4834]: I0130 22:14:34.970099 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-n4kck_ec5f427f-4be8-4066-a817-c9e2e3df4e6f/manager/0.log" Jan 30 22:14:35 crc kubenswrapper[4834]: I0130 22:14:35.012856 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-ccd87_d6aef56e-f8a3-4400-b28f-5bd40a323c73/manager/0.log" Jan 30 22:14:35 crc kubenswrapper[4834]: I0130 22:14:35.190106 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-vc2wt_d68f882f-c07c-4022-a6fa-f4814f313870/manager/0.log" Jan 30 
22:14:35 crc kubenswrapper[4834]: I0130 22:14:35.228860 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-lllql_e22f9414-3441-42d6-adde-8629c168c055/manager/0.log" Jan 30 22:14:35 crc kubenswrapper[4834]: I0130 22:14:35.438307 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-xrvft_b8c28d5a-26d3-4fdc-aa0e-587e93dfa5b6/manager/0.log" Jan 30 22:14:35 crc kubenswrapper[4834]: I0130 22:14:35.474155 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-889p5_eea6da10-7c27-42c7-a532-f872f8b7c86a/manager/0.log" Jan 30 22:14:35 crc kubenswrapper[4834]: I0130 22:14:35.681600 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4d2qn6x_1ba55708-11fd-4a17-9a95-88fd28711fb6/manager/0.log" Jan 30 22:14:35 crc kubenswrapper[4834]: I0130 22:14:35.758460 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-55fdcd6c79-cwbsn_e28ad8de-912f-418e-9706-56f0dd055527/operator/0.log" Jan 30 22:14:35 crc kubenswrapper[4834]: I0130 22:14:35.963199 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cchsg_7337253d-358a-4bdc-8f28-0a0aad4afe6b/registry-server/0.log" Jan 30 22:14:36 crc kubenswrapper[4834]: I0130 22:14:36.213884 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-phxxg_1855b8a8-7a5e-4516-9bc6-156c6bb52068/manager/0.log" Jan 30 22:14:36 crc kubenswrapper[4834]: I0130 22:14:36.299490 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-45npx_4d4e0469-0167-410e-9bfd-26f81b9900cf/manager/0.log" 
Jan 30 22:14:36 crc kubenswrapper[4834]: I0130 22:14:36.507978 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gxzqj_b998e309-c037-436c-aed4-12298af019ac/operator/0.log" Jan 30 22:14:36 crc kubenswrapper[4834]: I0130 22:14:36.735972 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-fgvhj_2150b962-b815-4dec-ac4c-468aad4dc16c/manager/0.log" Jan 30 22:14:36 crc kubenswrapper[4834]: I0130 22:14:36.810550 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6749767b8f-kk9tb_9c4aa5d9-5f04-43a6-92d5-8258862556d2/manager/0.log" Jan 30 22:14:36 crc kubenswrapper[4834]: I0130 22:14:36.980946 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-x4j85_9306fcee-b55f-488a-bc53-1a809c9f20e9/manager/0.log" Jan 30 22:14:37 crc kubenswrapper[4834]: I0130 22:14:37.073797 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d48698d88-q72fc_a9897e6e-a451-4b52-9135-dca4af64fbfb/manager/0.log" Jan 30 22:14:37 crc kubenswrapper[4834]: I0130 22:14:37.177577 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-zzf9h_570acfca-9a4a-403d-a421-b339b31def95/manager/0.log" Jan 30 22:14:56 crc kubenswrapper[4834]: I0130 22:14:56.089883 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xqnhs_011bab8f-7841-4e99-8d47-ee7ed71b9ec5/control-plane-machine-set-operator/0.log" Jan 30 22:14:56 crc kubenswrapper[4834]: I0130 22:14:56.200183 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5qhdb_00580dd0-6712-4cd5-b651-a200271e0727/kube-rbac-proxy/0.log" Jan 30 22:14:56 crc kubenswrapper[4834]: I0130 22:14:56.297882 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5qhdb_00580dd0-6712-4cd5-b651-a200271e0727/machine-api-operator/0.log" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.179749 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf"] Jan 30 22:15:00 crc kubenswrapper[4834]: E0130 22:15:00.180848 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="registry-server" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.180872 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="registry-server" Jan 30 22:15:00 crc kubenswrapper[4834]: E0130 22:15:00.180895 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a05f8a-620d-4833-8591-1179e8bd682f" containerName="container-00" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.180905 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a05f8a-620d-4833-8591-1179e8bd682f" containerName="container-00" Jan 30 22:15:00 crc kubenswrapper[4834]: E0130 22:15:00.180930 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="extract-content" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.180939 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="extract-content" Jan 30 22:15:00 crc kubenswrapper[4834]: E0130 22:15:00.180961 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="extract-utilities" Jan 30 22:15:00 crc 
kubenswrapper[4834]: I0130 22:15:00.180970 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="extract-utilities" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.181278 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b214c77-6fa3-4922-ba0f-c5e21b2bc0ae" containerName="registry-server" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.181292 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a05f8a-620d-4833-8591-1179e8bd682f" containerName="container-00" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.182209 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.185321 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.192966 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf"] Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.193533 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.223729 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd822dc7-079a-4db1-9559-1bb96c6b05f8-secret-volume\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.223786 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd822dc7-079a-4db1-9559-1bb96c6b05f8-config-volume\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.223912 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgrdf\" (UniqueName: \"kubernetes.io/projected/cd822dc7-079a-4db1-9559-1bb96c6b05f8-kube-api-access-wgrdf\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.325576 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgrdf\" (UniqueName: \"kubernetes.io/projected/cd822dc7-079a-4db1-9559-1bb96c6b05f8-kube-api-access-wgrdf\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.325717 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd822dc7-079a-4db1-9559-1bb96c6b05f8-secret-volume\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.325746 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd822dc7-079a-4db1-9559-1bb96c6b05f8-config-volume\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.326615 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd822dc7-079a-4db1-9559-1bb96c6b05f8-config-volume\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.331863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd822dc7-079a-4db1-9559-1bb96c6b05f8-secret-volume\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.343111 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgrdf\" (UniqueName: \"kubernetes.io/projected/cd822dc7-079a-4db1-9559-1bb96c6b05f8-kube-api-access-wgrdf\") pod \"collect-profiles-29496855-vcbgf\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:00 crc kubenswrapper[4834]: I0130 22:15:00.512523 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:01 crc kubenswrapper[4834]: I0130 22:15:01.046192 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf"] Jan 30 22:15:01 crc kubenswrapper[4834]: I0130 22:15:01.360935 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" event={"ID":"cd822dc7-079a-4db1-9559-1bb96c6b05f8","Type":"ContainerStarted","Data":"897120136a42464b69f289a8691003d57f9623f315e8724a352856f5d409e5d5"} Jan 30 22:15:01 crc kubenswrapper[4834]: I0130 22:15:01.361311 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" event={"ID":"cd822dc7-079a-4db1-9559-1bb96c6b05f8","Type":"ContainerStarted","Data":"611fe5916e2c1507771e4248a36d003e5360a2dad835fbce1e7d9ae35a881766"} Jan 30 22:15:01 crc kubenswrapper[4834]: I0130 22:15:01.386382 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" podStartSLOduration=1.38636584 podStartE2EDuration="1.38636584s" podCreationTimestamp="2026-01-30 22:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-30 22:15:01.38174706 +0000 UTC m=+3552.534893198" watchObservedRunningTime="2026-01-30 22:15:01.38636584 +0000 UTC m=+3552.539511968" Jan 30 22:15:02 crc kubenswrapper[4834]: I0130 22:15:02.370919 4834 generic.go:334] "Generic (PLEG): container finished" podID="cd822dc7-079a-4db1-9559-1bb96c6b05f8" containerID="897120136a42464b69f289a8691003d57f9623f315e8724a352856f5d409e5d5" exitCode=0 Jan 30 22:15:02 crc kubenswrapper[4834]: I0130 22:15:02.370974 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" event={"ID":"cd822dc7-079a-4db1-9559-1bb96c6b05f8","Type":"ContainerDied","Data":"897120136a42464b69f289a8691003d57f9623f315e8724a352856f5d409e5d5"} Jan 30 22:15:03 crc kubenswrapper[4834]: I0130 22:15:03.818989 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:03 crc kubenswrapper[4834]: I0130 22:15:03.930110 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd822dc7-079a-4db1-9559-1bb96c6b05f8-config-volume\") pod \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " Jan 30 22:15:03 crc kubenswrapper[4834]: I0130 22:15:03.930371 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgrdf\" (UniqueName: \"kubernetes.io/projected/cd822dc7-079a-4db1-9559-1bb96c6b05f8-kube-api-access-wgrdf\") pod \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " Jan 30 22:15:03 crc kubenswrapper[4834]: I0130 22:15:03.930585 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd822dc7-079a-4db1-9559-1bb96c6b05f8-secret-volume\") pod \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\" (UID: \"cd822dc7-079a-4db1-9559-1bb96c6b05f8\") " Jan 30 22:15:03 crc kubenswrapper[4834]: I0130 22:15:03.931122 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd822dc7-079a-4db1-9559-1bb96c6b05f8-config-volume" (OuterVolumeSpecName: "config-volume") pod "cd822dc7-079a-4db1-9559-1bb96c6b05f8" (UID: "cd822dc7-079a-4db1-9559-1bb96c6b05f8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 30 22:15:03 crc kubenswrapper[4834]: I0130 22:15:03.932549 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd822dc7-079a-4db1-9559-1bb96c6b05f8-config-volume\") on node \"crc\" DevicePath \"\"" Jan 30 22:15:03 crc kubenswrapper[4834]: I0130 22:15:03.937426 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd822dc7-079a-4db1-9559-1bb96c6b05f8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cd822dc7-079a-4db1-9559-1bb96c6b05f8" (UID: "cd822dc7-079a-4db1-9559-1bb96c6b05f8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 30 22:15:03 crc kubenswrapper[4834]: I0130 22:15:03.938042 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd822dc7-079a-4db1-9559-1bb96c6b05f8-kube-api-access-wgrdf" (OuterVolumeSpecName: "kube-api-access-wgrdf") pod "cd822dc7-079a-4db1-9559-1bb96c6b05f8" (UID: "cd822dc7-079a-4db1-9559-1bb96c6b05f8"). InnerVolumeSpecName "kube-api-access-wgrdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:15:04 crc kubenswrapper[4834]: I0130 22:15:04.035150 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgrdf\" (UniqueName: \"kubernetes.io/projected/cd822dc7-079a-4db1-9559-1bb96c6b05f8-kube-api-access-wgrdf\") on node \"crc\" DevicePath \"\"" Jan 30 22:15:04 crc kubenswrapper[4834]: I0130 22:15:04.035203 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd822dc7-079a-4db1-9559-1bb96c6b05f8-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 30 22:15:04 crc kubenswrapper[4834]: I0130 22:15:04.391931 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" event={"ID":"cd822dc7-079a-4db1-9559-1bb96c6b05f8","Type":"ContainerDied","Data":"611fe5916e2c1507771e4248a36d003e5360a2dad835fbce1e7d9ae35a881766"} Jan 30 22:15:04 crc kubenswrapper[4834]: I0130 22:15:04.391971 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="611fe5916e2c1507771e4248a36d003e5360a2dad835fbce1e7d9ae35a881766" Jan 30 22:15:04 crc kubenswrapper[4834]: I0130 22:15:04.391983 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29496855-vcbgf" Jan 30 22:15:04 crc kubenswrapper[4834]: I0130 22:15:04.469932 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh"] Jan 30 22:15:04 crc kubenswrapper[4834]: I0130 22:15:04.479537 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29496810-vtkrh"] Jan 30 22:15:05 crc kubenswrapper[4834]: I0130 22:15:05.546849 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2197ac71-9c5a-483c-9944-518bd37b0583" path="/var/lib/kubelet/pods/2197ac71-9c5a-483c-9944-518bd37b0583/volumes" Jan 30 22:15:10 crc kubenswrapper[4834]: I0130 22:15:10.127111 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-vsrc9_496f8c34-1261-4c09-9e7e-fa69c23cca44/cert-manager-controller/0.log" Jan 30 22:15:10 crc kubenswrapper[4834]: I0130 22:15:10.309774 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-b4mhg_9d83df3b-5c19-4bed-9a40-06f23afde5a9/cert-manager-cainjector/0.log" Jan 30 22:15:10 crc kubenswrapper[4834]: I0130 22:15:10.390925 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-z5xqc_1a3c27a9-b6a9-4971-b250-6b34d6528e5a/cert-manager-webhook/0.log" Jan 30 22:15:24 crc kubenswrapper[4834]: I0130 22:15:24.423659 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-4kdm4_b1cebcb5-34d7-4e5d-b5bb-569ed874a27c/nmstate-console-plugin/0.log" Jan 30 22:15:24 crc kubenswrapper[4834]: I0130 22:15:24.604741 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-627sx_cff9b6ad-5a56-4911-876a-51c7a25619c4/nmstate-handler/0.log" Jan 30 22:15:24 crc kubenswrapper[4834]: I0130 
22:15:24.670191 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-tx5rx_acb1f436-a969-4b96-a54a-0228575c680b/nmstate-metrics/0.log" Jan 30 22:15:24 crc kubenswrapper[4834]: I0130 22:15:24.670372 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-tx5rx_acb1f436-a969-4b96-a54a-0228575c680b/kube-rbac-proxy/0.log" Jan 30 22:15:24 crc kubenswrapper[4834]: I0130 22:15:24.815642 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-94zx2_acad5879-77cc-49f6-89c4-ccc5e97b9c4e/nmstate-operator/0.log" Jan 30 22:15:24 crc kubenswrapper[4834]: I0130 22:15:24.868499 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-frjh4_e7699088-70c6-4994-ad05-6ae59420798c/nmstate-webhook/0.log" Jan 30 22:15:34 crc kubenswrapper[4834]: I0130 22:15:34.161634 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:15:34 crc kubenswrapper[4834]: I0130 22:15:34.162139 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:15:38 crc kubenswrapper[4834]: I0130 22:15:38.107689 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56cf686fd5-j4wjx_539234c0-ea70-4188-b1d2-e5b758c78563/kube-rbac-proxy/0.log" Jan 30 22:15:38 crc kubenswrapper[4834]: I0130 22:15:38.159863 4834 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56cf686fd5-j4wjx_539234c0-ea70-4188-b1d2-e5b758c78563/manager/0.log" Jan 30 22:16:01 crc kubenswrapper[4834]: I0130 22:16:01.096949 4834 scope.go:117] "RemoveContainer" containerID="e0ef5120563536268b094d4c5b997bd2a18d963a578fa1a50eeded6edc98285c" Jan 30 22:16:04 crc kubenswrapper[4834]: I0130 22:16:04.161459 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:16:04 crc kubenswrapper[4834]: I0130 22:16:04.162021 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:16:05 crc kubenswrapper[4834]: I0130 22:16:05.856584 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-79cf69ddc8-4ttdj_8e8ca377-06ad-4b0a-9fde-e0e73f92f3ae/cluster-logging-operator/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.192758 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-zjpxd_854d0b76-06df-4a28-ad4e-b50396ef3248/collector/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.256680 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_21cb37c2-74d7-4840-9248-4330f12ead7a/loki-compactor/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.398591 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-distributor-5f678c8dd6-7m47l_3470b03a-ff5f-4654-b8f4-db3ee90be448/loki-distributor/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.483909 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5d9fb787f7-m7ft4_a0779eb6-e6eb-41f3-8c01-8072ad63eedd/gateway/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.557118 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5d9fb787f7-m7ft4_a0779eb6-e6eb-41f3-8c01-8072ad63eedd/opa/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.634379 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5d9fb787f7-rkvxh_2ed32950-7326-4344-bcdb-7843ca0162e1/gateway/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.690383 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5d9fb787f7-rkvxh_2ed32950-7326-4344-bcdb-7843ca0162e1/opa/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.807306 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_d6def372-844d-4aa5-9499-915742a71d36/loki-index-gateway/0.log" Jan 30 22:16:06 crc kubenswrapper[4834]: I0130 22:16:06.902663 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_e0bb0cb6-3429-4d3e-ac99-162bb485aa1b/loki-ingester/0.log" Jan 30 22:16:07 crc kubenswrapper[4834]: I0130 22:16:07.010971 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-76788598db-tcqxt_5fc9974d-5ec1-42b1-a557-2601e6168fa1/loki-querier/0.log" Jan 30 22:16:07 crc kubenswrapper[4834]: I0130 22:16:07.099802 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-query-frontend-69d9546745-tcdrc_032e9188-65b1-4456-9879-518958f9c1e7/loki-query-frontend/0.log" Jan 30 22:16:21 crc kubenswrapper[4834]: I0130 22:16:21.728647 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gmxj2_aaffa237-4e73-45f7-9a04-e0cd97abc541/kube-rbac-proxy/0.log" Jan 30 22:16:21 crc kubenswrapper[4834]: I0130 22:16:21.809878 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-gmxj2_aaffa237-4e73-45f7-9a04-e0cd97abc541/controller/0.log" Jan 30 22:16:21 crc kubenswrapper[4834]: I0130 22:16:21.926445 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-frr-files/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.124328 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-metrics/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.139262 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-frr-files/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.163866 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-reloader/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.186992 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-reloader/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.421021 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-metrics/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.441454 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-frr-files/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.450021 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-reloader/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.462615 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-metrics/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.610014 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-reloader/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.629990 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-frr-files/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.705873 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/controller/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.708445 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/cp-metrics/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.839175 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/frr-metrics/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.968958 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/kube-rbac-proxy-frr/0.log" Jan 30 22:16:22 crc kubenswrapper[4834]: I0130 22:16:22.975507 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/kube-rbac-proxy/0.log" Jan 30 22:16:23 crc kubenswrapper[4834]: I0130 22:16:23.122102 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/reloader/0.log" Jan 30 22:16:23 crc kubenswrapper[4834]: I0130 22:16:23.280032 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-kj45j_df5e4117-7286-4f0b-8ef7-d4e2b3d2eed8/frr-k8s-webhook-server/0.log" Jan 30 22:16:23 crc kubenswrapper[4834]: I0130 22:16:23.501715 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6dfb4f7bb8-zhhz4_db5dfdf6-696e-40f9-95a6-baa1b909a02e/manager/0.log" Jan 30 22:16:23 crc kubenswrapper[4834]: I0130 22:16:23.601235 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c67b9b9df-6cd7q_cce27a06-72ce-4b87-be28-e71501ec9291/webhook-server/0.log" Jan 30 22:16:23 crc kubenswrapper[4834]: I0130 22:16:23.831714 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dlj42_2650188e-fece-45c8-a478-c8ea2ec1552d/kube-rbac-proxy/0.log" Jan 30 22:16:23 crc kubenswrapper[4834]: I0130 22:16:23.987907 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xr9z_ff4a3ae3-827a-4874-9d7e-0be7bb7548a3/frr/0.log" Jan 30 22:16:24 crc kubenswrapper[4834]: I0130 22:16:24.129715 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dlj42_2650188e-fece-45c8-a478-c8ea2ec1552d/speaker/0.log" Jan 30 22:16:34 crc kubenswrapper[4834]: I0130 22:16:34.161120 4834 patch_prober.go:28] interesting pod/machine-config-daemon-drghn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 30 22:16:34 crc kubenswrapper[4834]: I0130 22:16:34.161698 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 30 22:16:34 crc kubenswrapper[4834]: I0130 22:16:34.161744 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drghn" Jan 30 22:16:34 crc kubenswrapper[4834]: I0130 22:16:34.162515 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e"} pod="openshift-machine-config-operator/machine-config-daemon-drghn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 30 22:16:34 crc kubenswrapper[4834]: I0130 22:16:34.162568 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerName="machine-config-daemon" containerID="cri-o://a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" gracePeriod=600 Jan 30 22:16:34 crc kubenswrapper[4834]: E0130 22:16:34.292875 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:16:35 crc 
kubenswrapper[4834]: I0130 22:16:35.314284 4834 generic.go:334] "Generic (PLEG): container finished" podID="296cf2a5-374e-4730-9d40-8abb93c8e237" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" exitCode=0 Jan 30 22:16:35 crc kubenswrapper[4834]: I0130 22:16:35.314336 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerDied","Data":"a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e"} Jan 30 22:16:35 crc kubenswrapper[4834]: I0130 22:16:35.314382 4834 scope.go:117] "RemoveContainer" containerID="2e3a5472d260ce3d2f1f21d2d78a8682942d804eb5c8811b48c2375a43798b0c" Jan 30 22:16:35 crc kubenswrapper[4834]: I0130 22:16:35.315173 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:16:35 crc kubenswrapper[4834]: E0130 22:16:35.315743 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 22:16:37.017543 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79/util/0.log" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 22:16:37.207724 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79/util/0.log" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 
22:16:37.227055 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79/pull/0.log" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 22:16:37.263110 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79/pull/0.log" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 22:16:37.607009 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79/extract/0.log" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 22:16:37.644059 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79/pull/0.log" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 22:16:37.687319 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xmk95_9f92c84d-9cef-44b9-a0c5-61e83cfbdf79/util/0.log" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 22:16:37.784194 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc_027a0611-c347-4644-ba70-7f12f2f9a344/util/0.log" Jan 30 22:16:37 crc kubenswrapper[4834]: I0130 22:16:37.980223 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc_027a0611-c347-4644-ba70-7f12f2f9a344/util/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.017416 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc_027a0611-c347-4644-ba70-7f12f2f9a344/pull/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.017586 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc_027a0611-c347-4644-ba70-7f12f2f9a344/pull/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.153385 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc_027a0611-c347-4644-ba70-7f12f2f9a344/util/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.196183 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc_027a0611-c347-4644-ba70-7f12f2f9a344/pull/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.197426 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dct58dc_027a0611-c347-4644-ba70-7f12f2f9a344/extract/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.353597 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_4b974467-c941-4dd3-86f1-e9757bce2972/util/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.515691 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_4b974467-c941-4dd3-86f1-e9757bce2972/util/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.534788 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_4b974467-c941-4dd3-86f1-e9757bce2972/pull/0.log" Jan 30 
22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.540714 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_4b974467-c941-4dd3-86f1-e9757bce2972/pull/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.706358 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_4b974467-c941-4dd3-86f1-e9757bce2972/util/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.708646 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_4b974467-c941-4dd3-86f1-e9757bce2972/extract/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.735968 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bks_4b974467-c941-4dd3-86f1-e9757bce2972/pull/0.log" Jan 30 22:16:38 crc kubenswrapper[4834]: I0130 22:16:38.870262 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg_390635a4-5536-4897-b656-587cf2dbf6dc/util/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.081102 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg_390635a4-5536-4897-b656-587cf2dbf6dc/util/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.090483 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg_390635a4-5536-4897-b656-587cf2dbf6dc/pull/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.105527 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg_390635a4-5536-4897-b656-587cf2dbf6dc/pull/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.317444 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg_390635a4-5536-4897-b656-587cf2dbf6dc/pull/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.347427 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg_390635a4-5536-4897-b656-587cf2dbf6dc/util/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.348406 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713dg2qg_390635a4-5536-4897-b656-587cf2dbf6dc/extract/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.508864 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ggts_a2ecf74e-0aec-464c-b7e7-11670319f04c/extract-utilities/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.667114 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ggts_a2ecf74e-0aec-464c-b7e7-11670319f04c/extract-utilities/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.711901 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ggts_a2ecf74e-0aec-464c-b7e7-11670319f04c/extract-content/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.716785 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ggts_a2ecf74e-0aec-464c-b7e7-11670319f04c/extract-content/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.914447 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-7ggts_a2ecf74e-0aec-464c-b7e7-11670319f04c/extract-utilities/0.log" Jan 30 22:16:39 crc kubenswrapper[4834]: I0130 22:16:39.942109 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ggts_a2ecf74e-0aec-464c-b7e7-11670319f04c/extract-content/0.log" Jan 30 22:16:40 crc kubenswrapper[4834]: I0130 22:16:40.127599 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-49knk_7d29ebe5-230e-468b-8344-bdfa02c88095/extract-utilities/0.log" Jan 30 22:16:40 crc kubenswrapper[4834]: I0130 22:16:40.375181 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-49knk_7d29ebe5-230e-468b-8344-bdfa02c88095/extract-utilities/0.log" Jan 30 22:16:40 crc kubenswrapper[4834]: I0130 22:16:40.409677 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-49knk_7d29ebe5-230e-468b-8344-bdfa02c88095/extract-content/0.log" Jan 30 22:16:40 crc kubenswrapper[4834]: I0130 22:16:40.444181 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ggts_a2ecf74e-0aec-464c-b7e7-11670319f04c/registry-server/0.log" Jan 30 22:16:40 crc kubenswrapper[4834]: I0130 22:16:40.505797 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-49knk_7d29ebe5-230e-468b-8344-bdfa02c88095/extract-content/0.log" Jan 30 22:16:40 crc kubenswrapper[4834]: I0130 22:16:40.618226 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-49knk_7d29ebe5-230e-468b-8344-bdfa02c88095/extract-utilities/0.log" Jan 30 22:16:40 crc kubenswrapper[4834]: I0130 22:16:40.705558 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-49knk_7d29ebe5-230e-468b-8344-bdfa02c88095/extract-content/0.log" Jan 30 22:16:40 crc kubenswrapper[4834]: I0130 22:16:40.891672 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lq5l9_92a7c64f-4f7c-473c-94cd-3ec4e3ae546e/marketplace-operator/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.040622 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2k5hc_ae12279c-fd14-40dd-a3bd-c3d5ceaac331/extract-utilities/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.146021 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-49knk_7d29ebe5-230e-468b-8344-bdfa02c88095/registry-server/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.270136 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2k5hc_ae12279c-fd14-40dd-a3bd-c3d5ceaac331/extract-content/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.282279 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2k5hc_ae12279c-fd14-40dd-a3bd-c3d5ceaac331/extract-utilities/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.286294 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2k5hc_ae12279c-fd14-40dd-a3bd-c3d5ceaac331/extract-content/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.457808 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2k5hc_ae12279c-fd14-40dd-a3bd-c3d5ceaac331/extract-utilities/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.469525 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-2k5hc_ae12279c-fd14-40dd-a3bd-c3d5ceaac331/extract-content/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.516531 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r7dp9_e8c12556-80ec-42b1-9d47-ead9224f86ff/extract-utilities/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.654647 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2k5hc_ae12279c-fd14-40dd-a3bd-c3d5ceaac331/registry-server/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.720499 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r7dp9_e8c12556-80ec-42b1-9d47-ead9224f86ff/extract-content/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.721144 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r7dp9_e8c12556-80ec-42b1-9d47-ead9224f86ff/extract-content/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.725921 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r7dp9_e8c12556-80ec-42b1-9d47-ead9224f86ff/extract-utilities/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.943026 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r7dp9_e8c12556-80ec-42b1-9d47-ead9224f86ff/extract-content/0.log" Jan 30 22:16:41 crc kubenswrapper[4834]: I0130 22:16:41.958363 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r7dp9_e8c12556-80ec-42b1-9d47-ead9224f86ff/extract-utilities/0.log" Jan 30 22:16:42 crc kubenswrapper[4834]: I0130 22:16:42.339999 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r7dp9_e8c12556-80ec-42b1-9d47-ead9224f86ff/registry-server/0.log" Jan 30 
22:16:48 crc kubenswrapper[4834]: I0130 22:16:48.530804 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:16:48 crc kubenswrapper[4834]: E0130 22:16:48.531759 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:16:59 crc kubenswrapper[4834]: I0130 22:16:59.542884 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:16:59 crc kubenswrapper[4834]: E0130 22:16:59.543862 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:17:06 crc kubenswrapper[4834]: I0130 22:17:06.904196 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56cf686fd5-j4wjx_539234c0-ea70-4188-b1d2-e5b758c78563/kube-rbac-proxy/0.log" Jan 30 22:17:07 crc kubenswrapper[4834]: I0130 22:17:07.005058 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-56cf686fd5-j4wjx_539234c0-ea70-4188-b1d2-e5b758c78563/manager/0.log" Jan 30 22:17:12 crc kubenswrapper[4834]: I0130 22:17:12.533038 4834 scope.go:117] "RemoveContainer" 
containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:17:12 crc kubenswrapper[4834]: E0130 22:17:12.534279 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:17:16 crc kubenswrapper[4834]: E0130 22:17:16.636617 4834 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.106:48828->38.102.83.106:45169: write tcp 38.102.83.106:48828->38.102.83.106:45169: write: broken pipe Jan 30 22:17:27 crc kubenswrapper[4834]: I0130 22:17:27.541714 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:17:27 crc kubenswrapper[4834]: E0130 22:17:27.542520 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:17:42 crc kubenswrapper[4834]: I0130 22:17:42.531879 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:17:42 crc kubenswrapper[4834]: E0130 22:17:42.532756 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:17:53 crc kubenswrapper[4834]: I0130 22:17:53.533138 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:17:53 crc kubenswrapper[4834]: E0130 22:17:53.534451 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:18:05 crc kubenswrapper[4834]: I0130 22:18:05.532226 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:18:05 crc kubenswrapper[4834]: E0130 22:18:05.533059 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:18:19 crc kubenswrapper[4834]: I0130 22:18:19.539804 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:18:19 crc kubenswrapper[4834]: E0130 22:18:19.540543 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.531825 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:18:31 crc kubenswrapper[4834]: E0130 22:18:31.533992 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.712907 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b455c"] Jan 30 22:18:31 crc kubenswrapper[4834]: E0130 22:18:31.713941 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd822dc7-079a-4db1-9559-1bb96c6b05f8" containerName="collect-profiles" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.713964 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd822dc7-079a-4db1-9559-1bb96c6b05f8" containerName="collect-profiles" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.714274 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd822dc7-079a-4db1-9559-1bb96c6b05f8" containerName="collect-profiles" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.719507 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.728156 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b455c"] Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.888145 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-utilities\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.888572 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-catalog-content\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.888627 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8hw9\" (UniqueName: \"kubernetes.io/projected/e6968481-0d80-4ecc-962c-136afc005858-kube-api-access-n8hw9\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.990517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-utilities\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.990609 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-catalog-content\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.990645 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8hw9\" (UniqueName: \"kubernetes.io/projected/e6968481-0d80-4ecc-962c-136afc005858-kube-api-access-n8hw9\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.991369 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-utilities\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:31 crc kubenswrapper[4834]: I0130 22:18:31.991464 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-catalog-content\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:32 crc kubenswrapper[4834]: I0130 22:18:32.012734 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8hw9\" (UniqueName: \"kubernetes.io/projected/e6968481-0d80-4ecc-962c-136afc005858-kube-api-access-n8hw9\") pod \"redhat-marketplace-b455c\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:32 crc kubenswrapper[4834]: I0130 22:18:32.098029 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:32 crc kubenswrapper[4834]: I0130 22:18:32.600903 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b455c"] Jan 30 22:18:33 crc kubenswrapper[4834]: I0130 22:18:33.452211 4834 generic.go:334] "Generic (PLEG): container finished" podID="e6968481-0d80-4ecc-962c-136afc005858" containerID="432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a" exitCode=0 Jan 30 22:18:33 crc kubenswrapper[4834]: I0130 22:18:33.452264 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b455c" event={"ID":"e6968481-0d80-4ecc-962c-136afc005858","Type":"ContainerDied","Data":"432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a"} Jan 30 22:18:33 crc kubenswrapper[4834]: I0130 22:18:33.452854 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b455c" event={"ID":"e6968481-0d80-4ecc-962c-136afc005858","Type":"ContainerStarted","Data":"f832a98bbc879087aaabfc47c99fc4c8e64a9ab2015927ba7550c1eda4eeb99b"} Jan 30 22:18:33 crc kubenswrapper[4834]: I0130 22:18:33.455239 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 30 22:18:35 crc kubenswrapper[4834]: I0130 22:18:35.484714 4834 generic.go:334] "Generic (PLEG): container finished" podID="e6968481-0d80-4ecc-962c-136afc005858" containerID="ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec" exitCode=0 Jan 30 22:18:35 crc kubenswrapper[4834]: I0130 22:18:35.484782 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b455c" event={"ID":"e6968481-0d80-4ecc-962c-136afc005858","Type":"ContainerDied","Data":"ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec"} Jan 30 22:18:36 crc kubenswrapper[4834]: I0130 22:18:36.497386 4834 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-b455c" event={"ID":"e6968481-0d80-4ecc-962c-136afc005858","Type":"ContainerStarted","Data":"31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874"} Jan 30 22:18:36 crc kubenswrapper[4834]: I0130 22:18:36.529479 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b455c" podStartSLOduration=3.097387747 podStartE2EDuration="5.529460091s" podCreationTimestamp="2026-01-30 22:18:31 +0000 UTC" firstStartedPulling="2026-01-30 22:18:33.454960864 +0000 UTC m=+3764.608107002" lastFinishedPulling="2026-01-30 22:18:35.887033208 +0000 UTC m=+3767.040179346" observedRunningTime="2026-01-30 22:18:36.526451826 +0000 UTC m=+3767.679597994" watchObservedRunningTime="2026-01-30 22:18:36.529460091 +0000 UTC m=+3767.682606229" Jan 30 22:18:42 crc kubenswrapper[4834]: I0130 22:18:42.098682 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:42 crc kubenswrapper[4834]: I0130 22:18:42.099137 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:42 crc kubenswrapper[4834]: I0130 22:18:42.470050 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:42 crc kubenswrapper[4834]: I0130 22:18:42.635782 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:42 crc kubenswrapper[4834]: I0130 22:18:42.713331 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b455c"] Jan 30 22:18:44 crc kubenswrapper[4834]: I0130 22:18:44.530910 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:18:44 crc kubenswrapper[4834]: 
E0130 22:18:44.531808 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:18:44 crc kubenswrapper[4834]: I0130 22:18:44.570765 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b455c" podUID="e6968481-0d80-4ecc-962c-136afc005858" containerName="registry-server" containerID="cri-o://31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874" gracePeriod=2 Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.108543 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.254688 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8hw9\" (UniqueName: \"kubernetes.io/projected/e6968481-0d80-4ecc-962c-136afc005858-kube-api-access-n8hw9\") pod \"e6968481-0d80-4ecc-962c-136afc005858\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.254780 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-catalog-content\") pod \"e6968481-0d80-4ecc-962c-136afc005858\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.254831 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-utilities\") pod 
\"e6968481-0d80-4ecc-962c-136afc005858\" (UID: \"e6968481-0d80-4ecc-962c-136afc005858\") " Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.255791 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-utilities" (OuterVolumeSpecName: "utilities") pod "e6968481-0d80-4ecc-962c-136afc005858" (UID: "e6968481-0d80-4ecc-962c-136afc005858"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.261111 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6968481-0d80-4ecc-962c-136afc005858-kube-api-access-n8hw9" (OuterVolumeSpecName: "kube-api-access-n8hw9") pod "e6968481-0d80-4ecc-962c-136afc005858" (UID: "e6968481-0d80-4ecc-962c-136afc005858"). InnerVolumeSpecName "kube-api-access-n8hw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.284602 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6968481-0d80-4ecc-962c-136afc005858" (UID: "e6968481-0d80-4ecc-962c-136afc005858"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.357008 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8hw9\" (UniqueName: \"kubernetes.io/projected/e6968481-0d80-4ecc-962c-136afc005858-kube-api-access-n8hw9\") on node \"crc\" DevicePath \"\"" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.357041 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.357050 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6968481-0d80-4ecc-962c-136afc005858-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.579707 4834 generic.go:334] "Generic (PLEG): container finished" podID="9efbd46f-63c9-4934-9888-90224238632c" containerID="4871983d95be849e47f13bc398b81733014ba355acb24e0ab9642b47e286994e" exitCode=0 Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.579786 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" event={"ID":"9efbd46f-63c9-4934-9888-90224238632c","Type":"ContainerDied","Data":"4871983d95be849e47f13bc398b81733014ba355acb24e0ab9642b47e286994e"} Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.580514 4834 scope.go:117] "RemoveContainer" containerID="4871983d95be849e47f13bc398b81733014ba355acb24e0ab9642b47e286994e" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.582125 4834 generic.go:334] "Generic (PLEG): container finished" podID="e6968481-0d80-4ecc-962c-136afc005858" containerID="31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874" exitCode=0 Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.582172 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-b455c" event={"ID":"e6968481-0d80-4ecc-962c-136afc005858","Type":"ContainerDied","Data":"31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874"} Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.582214 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b455c" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.582233 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b455c" event={"ID":"e6968481-0d80-4ecc-962c-136afc005858","Type":"ContainerDied","Data":"f832a98bbc879087aaabfc47c99fc4c8e64a9ab2015927ba7550c1eda4eeb99b"} Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.582267 4834 scope.go:117] "RemoveContainer" containerID="31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.611728 4834 scope.go:117] "RemoveContainer" containerID="ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.638966 4834 scope.go:117] "RemoveContainer" containerID="432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.644251 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b455c"] Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.654835 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b455c"] Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.678887 4834 scope.go:117] "RemoveContainer" containerID="31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874" Jan 30 22:18:45 crc kubenswrapper[4834]: E0130 22:18:45.682802 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874\": container with ID starting with 31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874 not found: ID does not exist" containerID="31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.682853 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874"} err="failed to get container status \"31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874\": rpc error: code = NotFound desc = could not find container \"31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874\": container with ID starting with 31d27a87dd703e0efaeb7d059fee88962791262ea2cabd4dcc5c38b3b2402874 not found: ID does not exist" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.682883 4834 scope.go:117] "RemoveContainer" containerID="ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec" Jan 30 22:18:45 crc kubenswrapper[4834]: E0130 22:18:45.683336 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec\": container with ID starting with ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec not found: ID does not exist" containerID="ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.683374 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec"} err="failed to get container status \"ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec\": rpc error: code = NotFound desc = could not find container \"ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec\": container with ID 
starting with ddccd67df0c7ccf6b571dd426a734c13ecedf4e293ee9b58cffdb7226026c1ec not found: ID does not exist" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.683409 4834 scope.go:117] "RemoveContainer" containerID="432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a" Jan 30 22:18:45 crc kubenswrapper[4834]: E0130 22:18:45.683985 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a\": container with ID starting with 432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a not found: ID does not exist" containerID="432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.684023 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a"} err="failed to get container status \"432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a\": rpc error: code = NotFound desc = could not find container \"432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a\": container with ID starting with 432197f7ed20676d24c963225909e36e062f8476ba6ff108dc76c9d279c0725a not found: ID does not exist" Jan 30 22:18:45 crc kubenswrapper[4834]: I0130 22:18:45.811197 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zrmdk_must-gather-lcnbh_9efbd46f-63c9-4934-9888-90224238632c/gather/0.log" Jan 30 22:18:47 crc kubenswrapper[4834]: I0130 22:18:47.543000 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6968481-0d80-4ecc-962c-136afc005858" path="/var/lib/kubelet/pods/e6968481-0d80-4ecc-962c-136afc005858/volumes" Jan 30 22:18:54 crc kubenswrapper[4834]: I0130 22:18:54.491062 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zrmdk/must-gather-lcnbh"] Jan 30 
22:18:54 crc kubenswrapper[4834]: I0130 22:18:54.491879 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" podUID="9efbd46f-63c9-4934-9888-90224238632c" containerName="copy" containerID="cri-o://a797184afdb0b9b030db81947817c0c181bc79a00500cfbe6e726a10a161a769" gracePeriod=2 Jan 30 22:18:54 crc kubenswrapper[4834]: I0130 22:18:54.502706 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zrmdk/must-gather-lcnbh"] Jan 30 22:18:54 crc kubenswrapper[4834]: I0130 22:18:54.675222 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zrmdk_must-gather-lcnbh_9efbd46f-63c9-4934-9888-90224238632c/copy/0.log" Jan 30 22:18:54 crc kubenswrapper[4834]: I0130 22:18:54.675728 4834 generic.go:334] "Generic (PLEG): container finished" podID="9efbd46f-63c9-4934-9888-90224238632c" containerID="a797184afdb0b9b030db81947817c0c181bc79a00500cfbe6e726a10a161a769" exitCode=143 Jan 30 22:18:54 crc kubenswrapper[4834]: I0130 22:18:54.930428 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zrmdk_must-gather-lcnbh_9efbd46f-63c9-4934-9888-90224238632c/copy/0.log" Jan 30 22:18:54 crc kubenswrapper[4834]: I0130 22:18:54.930941 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.074697 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9efbd46f-63c9-4934-9888-90224238632c-must-gather-output\") pod \"9efbd46f-63c9-4934-9888-90224238632c\" (UID: \"9efbd46f-63c9-4934-9888-90224238632c\") " Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.075068 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcdxr\" (UniqueName: \"kubernetes.io/projected/9efbd46f-63c9-4934-9888-90224238632c-kube-api-access-gcdxr\") pod \"9efbd46f-63c9-4934-9888-90224238632c\" (UID: \"9efbd46f-63c9-4934-9888-90224238632c\") " Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.087823 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9efbd46f-63c9-4934-9888-90224238632c-kube-api-access-gcdxr" (OuterVolumeSpecName: "kube-api-access-gcdxr") pod "9efbd46f-63c9-4934-9888-90224238632c" (UID: "9efbd46f-63c9-4934-9888-90224238632c"). InnerVolumeSpecName "kube-api-access-gcdxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.178109 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcdxr\" (UniqueName: \"kubernetes.io/projected/9efbd46f-63c9-4934-9888-90224238632c-kube-api-access-gcdxr\") on node \"crc\" DevicePath \"\"" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.275874 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9efbd46f-63c9-4934-9888-90224238632c-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9efbd46f-63c9-4934-9888-90224238632c" (UID: "9efbd46f-63c9-4934-9888-90224238632c"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.280412 4834 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9efbd46f-63c9-4934-9888-90224238632c-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.541429 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9efbd46f-63c9-4934-9888-90224238632c" path="/var/lib/kubelet/pods/9efbd46f-63c9-4934-9888-90224238632c/volumes" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.687773 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zrmdk_must-gather-lcnbh_9efbd46f-63c9-4934-9888-90224238632c/copy/0.log" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.688244 4834 scope.go:117] "RemoveContainer" containerID="a797184afdb0b9b030db81947817c0c181bc79a00500cfbe6e726a10a161a769" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.688354 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zrmdk/must-gather-lcnbh" Jan 30 22:18:55 crc kubenswrapper[4834]: I0130 22:18:55.713715 4834 scope.go:117] "RemoveContainer" containerID="4871983d95be849e47f13bc398b81733014ba355acb24e0ab9642b47e286994e" Jan 30 22:18:59 crc kubenswrapper[4834]: I0130 22:18:59.538383 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:18:59 crc kubenswrapper[4834]: E0130 22:18:59.539254 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:19:14 crc kubenswrapper[4834]: I0130 22:19:14.531492 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:19:14 crc kubenswrapper[4834]: E0130 22:19:14.532276 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:19:29 crc kubenswrapper[4834]: I0130 22:19:29.541496 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:19:29 crc kubenswrapper[4834]: E0130 22:19:29.542291 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:19:42 crc kubenswrapper[4834]: I0130 22:19:42.531474 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:19:42 crc kubenswrapper[4834]: E0130 22:19:42.532273 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:19:56 crc kubenswrapper[4834]: I0130 22:19:56.532804 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:19:56 crc kubenswrapper[4834]: E0130 22:19:56.534110 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:20:10 crc kubenswrapper[4834]: I0130 22:20:10.531460 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:20:10 crc kubenswrapper[4834]: E0130 22:20:10.532378 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.602165 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-45qz7"] Jan 30 22:20:14 crc kubenswrapper[4834]: E0130 22:20:14.603169 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6968481-0d80-4ecc-962c-136afc005858" containerName="registry-server" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.603186 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6968481-0d80-4ecc-962c-136afc005858" containerName="registry-server" Jan 30 22:20:14 crc kubenswrapper[4834]: E0130 22:20:14.603212 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efbd46f-63c9-4934-9888-90224238632c" containerName="copy" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.603220 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efbd46f-63c9-4934-9888-90224238632c" containerName="copy" Jan 30 22:20:14 crc kubenswrapper[4834]: E0130 22:20:14.603234 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9efbd46f-63c9-4934-9888-90224238632c" containerName="gather" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.603242 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9efbd46f-63c9-4934-9888-90224238632c" containerName="gather" Jan 30 22:20:14 crc kubenswrapper[4834]: E0130 22:20:14.603254 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6968481-0d80-4ecc-962c-136afc005858" containerName="extract-content" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.603262 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6968481-0d80-4ecc-962c-136afc005858" 
containerName="extract-content" Jan 30 22:20:14 crc kubenswrapper[4834]: E0130 22:20:14.603300 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6968481-0d80-4ecc-962c-136afc005858" containerName="extract-utilities" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.603309 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6968481-0d80-4ecc-962c-136afc005858" containerName="extract-utilities" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.603579 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6968481-0d80-4ecc-962c-136afc005858" containerName="registry-server" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.603597 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9efbd46f-63c9-4934-9888-90224238632c" containerName="copy" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.603609 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9efbd46f-63c9-4934-9888-90224238632c" containerName="gather" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.605419 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.615609 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45qz7"] Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.774112 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-utilities\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.774312 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-catalog-content\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.774368 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zppjs\" (UniqueName: \"kubernetes.io/projected/9777b895-d9e0-4777-8ef7-5a296dc47522-kube-api-access-zppjs\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.876274 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-catalog-content\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.876360 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zppjs\" (UniqueName: \"kubernetes.io/projected/9777b895-d9e0-4777-8ef7-5a296dc47522-kube-api-access-zppjs\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.876444 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-utilities\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.876900 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-catalog-content\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.876953 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-utilities\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.901198 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zppjs\" (UniqueName: \"kubernetes.io/projected/9777b895-d9e0-4777-8ef7-5a296dc47522-kube-api-access-zppjs\") pod \"community-operators-45qz7\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:14 crc kubenswrapper[4834]: I0130 22:20:14.926506 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:15 crc kubenswrapper[4834]: I0130 22:20:15.809509 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-45qz7"] Jan 30 22:20:16 crc kubenswrapper[4834]: I0130 22:20:16.543033 4834 generic.go:334] "Generic (PLEG): container finished" podID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerID="6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b" exitCode=0 Jan 30 22:20:16 crc kubenswrapper[4834]: I0130 22:20:16.543160 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45qz7" event={"ID":"9777b895-d9e0-4777-8ef7-5a296dc47522","Type":"ContainerDied","Data":"6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b"} Jan 30 22:20:16 crc kubenswrapper[4834]: I0130 22:20:16.543458 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45qz7" event={"ID":"9777b895-d9e0-4777-8ef7-5a296dc47522","Type":"ContainerStarted","Data":"8bdbcdb5f01b8cf2d592d5c7440bba613243c3faff6102caafdfabe89f2b5181"} Jan 30 22:20:18 crc kubenswrapper[4834]: I0130 22:20:18.566234 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45qz7" event={"ID":"9777b895-d9e0-4777-8ef7-5a296dc47522","Type":"ContainerStarted","Data":"dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da"} Jan 30 22:20:20 crc kubenswrapper[4834]: I0130 22:20:20.585187 4834 generic.go:334] "Generic (PLEG): container finished" podID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerID="dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da" exitCode=0 Jan 30 22:20:20 crc kubenswrapper[4834]: I0130 22:20:20.585246 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45qz7" 
event={"ID":"9777b895-d9e0-4777-8ef7-5a296dc47522","Type":"ContainerDied","Data":"dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da"} Jan 30 22:20:21 crc kubenswrapper[4834]: I0130 22:20:21.597312 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45qz7" event={"ID":"9777b895-d9e0-4777-8ef7-5a296dc47522","Type":"ContainerStarted","Data":"45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116"} Jan 30 22:20:21 crc kubenswrapper[4834]: I0130 22:20:21.620620 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-45qz7" podStartSLOduration=3.119195877 podStartE2EDuration="7.620594872s" podCreationTimestamp="2026-01-30 22:20:14 +0000 UTC" firstStartedPulling="2026-01-30 22:20:16.545211048 +0000 UTC m=+3867.698357196" lastFinishedPulling="2026-01-30 22:20:21.046610053 +0000 UTC m=+3872.199756191" observedRunningTime="2026-01-30 22:20:21.619089649 +0000 UTC m=+3872.772235827" watchObservedRunningTime="2026-01-30 22:20:21.620594872 +0000 UTC m=+3872.773741010" Jan 30 22:20:24 crc kubenswrapper[4834]: I0130 22:20:24.927457 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:24 crc kubenswrapper[4834]: I0130 22:20:24.928573 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:24 crc kubenswrapper[4834]: I0130 22:20:24.996286 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:25 crc kubenswrapper[4834]: I0130 22:20:25.532487 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:20:25 crc kubenswrapper[4834]: E0130 22:20:25.533544 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:20:34 crc kubenswrapper[4834]: I0130 22:20:34.976078 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:35 crc kubenswrapper[4834]: I0130 22:20:35.035743 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-45qz7"] Jan 30 22:20:35 crc kubenswrapper[4834]: I0130 22:20:35.734937 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-45qz7" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerName="registry-server" containerID="cri-o://45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116" gracePeriod=2 Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.228502 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.365075 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-catalog-content\") pod \"9777b895-d9e0-4777-8ef7-5a296dc47522\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.365116 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-utilities\") pod \"9777b895-d9e0-4777-8ef7-5a296dc47522\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.365273 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zppjs\" (UniqueName: \"kubernetes.io/projected/9777b895-d9e0-4777-8ef7-5a296dc47522-kube-api-access-zppjs\") pod \"9777b895-d9e0-4777-8ef7-5a296dc47522\" (UID: \"9777b895-d9e0-4777-8ef7-5a296dc47522\") " Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.366021 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-utilities" (OuterVolumeSpecName: "utilities") pod "9777b895-d9e0-4777-8ef7-5a296dc47522" (UID: "9777b895-d9e0-4777-8ef7-5a296dc47522"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.376054 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9777b895-d9e0-4777-8ef7-5a296dc47522-kube-api-access-zppjs" (OuterVolumeSpecName: "kube-api-access-zppjs") pod "9777b895-d9e0-4777-8ef7-5a296dc47522" (UID: "9777b895-d9e0-4777-8ef7-5a296dc47522"). InnerVolumeSpecName "kube-api-access-zppjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.414606 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9777b895-d9e0-4777-8ef7-5a296dc47522" (UID: "9777b895-d9e0-4777-8ef7-5a296dc47522"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.467532 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zppjs\" (UniqueName: \"kubernetes.io/projected/9777b895-d9e0-4777-8ef7-5a296dc47522-kube-api-access-zppjs\") on node \"crc\" DevicePath \"\"" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.467566 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.467580 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9777b895-d9e0-4777-8ef7-5a296dc47522-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.744294 4834 generic.go:334] "Generic (PLEG): container finished" podID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerID="45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116" exitCode=0 Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.744404 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45qz7" event={"ID":"9777b895-d9e0-4777-8ef7-5a296dc47522","Type":"ContainerDied","Data":"45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116"} Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.744465 4834 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-45qz7" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.744667 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-45qz7" event={"ID":"9777b895-d9e0-4777-8ef7-5a296dc47522","Type":"ContainerDied","Data":"8bdbcdb5f01b8cf2d592d5c7440bba613243c3faff6102caafdfabe89f2b5181"} Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.744684 4834 scope.go:117] "RemoveContainer" containerID="45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.769078 4834 scope.go:117] "RemoveContainer" containerID="dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.781004 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-45qz7"] Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.790058 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-45qz7"] Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.795745 4834 scope.go:117] "RemoveContainer" containerID="6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.834739 4834 scope.go:117] "RemoveContainer" containerID="45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116" Jan 30 22:20:36 crc kubenswrapper[4834]: E0130 22:20:36.835215 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116\": container with ID starting with 45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116 not found: ID does not exist" containerID="45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.835260 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116"} err="failed to get container status \"45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116\": rpc error: code = NotFound desc = could not find container \"45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116\": container with ID starting with 45ea5d85d219ead093f07a606e03851d962dea2ba12353aadbaf0111c7dd5116 not found: ID does not exist" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.835285 4834 scope.go:117] "RemoveContainer" containerID="dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da" Jan 30 22:20:36 crc kubenswrapper[4834]: E0130 22:20:36.835760 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da\": container with ID starting with dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da not found: ID does not exist" containerID="dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.835807 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da"} err="failed to get container status \"dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da\": rpc error: code = NotFound desc = could not find container \"dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da\": container with ID starting with dbceb13723a2ca9507f2b0669a440859b11070a9251cd0f3c68568238e3799da not found: ID does not exist" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.835832 4834 scope.go:117] "RemoveContainer" containerID="6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b" Jan 30 22:20:36 crc kubenswrapper[4834]: E0130 
22:20:36.836191 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b\": container with ID starting with 6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b not found: ID does not exist" containerID="6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b" Jan 30 22:20:36 crc kubenswrapper[4834]: I0130 22:20:36.836218 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b"} err="failed to get container status \"6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b\": rpc error: code = NotFound desc = could not find container \"6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b\": container with ID starting with 6c210fb3d08ec58c47d550cd8a683c68e976f07cac0e1af886d36a728ce1608b not found: ID does not exist" Jan 30 22:20:37 crc kubenswrapper[4834]: I0130 22:20:37.548305 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" path="/var/lib/kubelet/pods/9777b895-d9e0-4777-8ef7-5a296dc47522/volumes" Jan 30 22:20:39 crc kubenswrapper[4834]: I0130 22:20:39.531462 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:20:39 crc kubenswrapper[4834]: E0130 22:20:39.532490 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:20:51 crc kubenswrapper[4834]: I0130 22:20:51.532999 
4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:20:51 crc kubenswrapper[4834]: E0130 22:20:51.533687 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:21:02 crc kubenswrapper[4834]: I0130 22:21:02.531188 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:21:02 crc kubenswrapper[4834]: E0130 22:21:02.531928 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:21:16 crc kubenswrapper[4834]: I0130 22:21:16.532767 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:21:16 crc kubenswrapper[4834]: E0130 22:21:16.533637 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:21:28 crc kubenswrapper[4834]: I0130 
22:21:28.532700 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:21:28 crc kubenswrapper[4834]: E0130 22:21:28.533468 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drghn_openshift-machine-config-operator(296cf2a5-374e-4730-9d40-8abb93c8e237)\"" pod="openshift-machine-config-operator/machine-config-daemon-drghn" podUID="296cf2a5-374e-4730-9d40-8abb93c8e237" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.531416 4834 scope.go:117] "RemoveContainer" containerID="a9c8963349924e75e82a99be8c39e35f96c47507197d127c0ed819b1436ebd6e" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.876668 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mb5nq"] Jan 30 22:21:40 crc kubenswrapper[4834]: E0130 22:21:40.878986 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerName="extract-utilities" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.879025 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerName="extract-utilities" Jan 30 22:21:40 crc kubenswrapper[4834]: E0130 22:21:40.879051 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerName="extract-content" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.879058 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerName="extract-content" Jan 30 22:21:40 crc kubenswrapper[4834]: E0130 22:21:40.879107 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerName="registry-server" Jan 30 22:21:40 crc 
kubenswrapper[4834]: I0130 22:21:40.879113 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerName="registry-server" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.879680 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9777b895-d9e0-4777-8ef7-5a296dc47522" containerName="registry-server" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.890569 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.897131 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mb5nq"] Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.973919 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xwjv\" (UniqueName: \"kubernetes.io/projected/389c204c-151a-4b0c-a99e-6f54de97086c-kube-api-access-9xwjv\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.974040 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-catalog-content\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:40 crc kubenswrapper[4834]: I0130 22:21:40.974213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-utilities\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 
22:21:41 crc kubenswrapper[4834]: I0130 22:21:41.075606 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwjv\" (UniqueName: \"kubernetes.io/projected/389c204c-151a-4b0c-a99e-6f54de97086c-kube-api-access-9xwjv\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:41 crc kubenswrapper[4834]: I0130 22:21:41.075694 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-catalog-content\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:41 crc kubenswrapper[4834]: I0130 22:21:41.075783 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-utilities\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:41 crc kubenswrapper[4834]: I0130 22:21:41.076444 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-utilities\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:41 crc kubenswrapper[4834]: I0130 22:21:41.076610 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-catalog-content\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:41 crc kubenswrapper[4834]: I0130 
22:21:41.096147 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xwjv\" (UniqueName: \"kubernetes.io/projected/389c204c-151a-4b0c-a99e-6f54de97086c-kube-api-access-9xwjv\") pod \"certified-operators-mb5nq\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:41 crc kubenswrapper[4834]: I0130 22:21:41.240655 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:41 crc kubenswrapper[4834]: I0130 22:21:41.472197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drghn" event={"ID":"296cf2a5-374e-4730-9d40-8abb93c8e237","Type":"ContainerStarted","Data":"b803d2bf6c6622ebc487b10bd8bfc40888e82f789b7b805077f7f89d886b86de"} Jan 30 22:21:41 crc kubenswrapper[4834]: I0130 22:21:41.776167 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mb5nq"] Jan 30 22:21:42 crc kubenswrapper[4834]: I0130 22:21:42.483689 4834 generic.go:334] "Generic (PLEG): container finished" podID="389c204c-151a-4b0c-a99e-6f54de97086c" containerID="cb51744d3f148635263ee7dfe9a937fd0e06c3fb315c9f01479f00820682f16e" exitCode=0 Jan 30 22:21:42 crc kubenswrapper[4834]: I0130 22:21:42.483739 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mb5nq" event={"ID":"389c204c-151a-4b0c-a99e-6f54de97086c","Type":"ContainerDied","Data":"cb51744d3f148635263ee7dfe9a937fd0e06c3fb315c9f01479f00820682f16e"} Jan 30 22:21:42 crc kubenswrapper[4834]: I0130 22:21:42.484005 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mb5nq" event={"ID":"389c204c-151a-4b0c-a99e-6f54de97086c","Type":"ContainerStarted","Data":"8d7689279611dff0d036a7484ac8076a264ec66158b021d8d4453c751210b332"} Jan 30 22:21:43 crc kubenswrapper[4834]: 
I0130 22:21:43.496023 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mb5nq" event={"ID":"389c204c-151a-4b0c-a99e-6f54de97086c","Type":"ContainerStarted","Data":"01d2fb613339a797511527e19a832e0a4d88bf149e4e1bf6cf10cd75ce73615f"} Jan 30 22:21:45 crc kubenswrapper[4834]: I0130 22:21:45.518672 4834 generic.go:334] "Generic (PLEG): container finished" podID="389c204c-151a-4b0c-a99e-6f54de97086c" containerID="01d2fb613339a797511527e19a832e0a4d88bf149e4e1bf6cf10cd75ce73615f" exitCode=0 Jan 30 22:21:45 crc kubenswrapper[4834]: I0130 22:21:45.518742 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mb5nq" event={"ID":"389c204c-151a-4b0c-a99e-6f54de97086c","Type":"ContainerDied","Data":"01d2fb613339a797511527e19a832e0a4d88bf149e4e1bf6cf10cd75ce73615f"} Jan 30 22:21:46 crc kubenswrapper[4834]: I0130 22:21:46.531538 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mb5nq" event={"ID":"389c204c-151a-4b0c-a99e-6f54de97086c","Type":"ContainerStarted","Data":"ba625d3c0b6b72b5f133858d0634e5d6c8f4591bf9d8ac9cc6c4816dd0b3037a"} Jan 30 22:21:46 crc kubenswrapper[4834]: I0130 22:21:46.553724 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mb5nq" podStartSLOduration=3.025560521 podStartE2EDuration="6.553703259s" podCreationTimestamp="2026-01-30 22:21:40 +0000 UTC" firstStartedPulling="2026-01-30 22:21:42.48696646 +0000 UTC m=+3953.640112618" lastFinishedPulling="2026-01-30 22:21:46.015109218 +0000 UTC m=+3957.168255356" observedRunningTime="2026-01-30 22:21:46.548506942 +0000 UTC m=+3957.701653090" watchObservedRunningTime="2026-01-30 22:21:46.553703259 +0000 UTC m=+3957.706849397" Jan 30 22:21:51 crc kubenswrapper[4834]: I0130 22:21:51.241158 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:51 crc kubenswrapper[4834]: I0130 22:21:51.241820 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:51 crc kubenswrapper[4834]: I0130 22:21:51.299744 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:51 crc kubenswrapper[4834]: I0130 22:21:51.638705 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:51 crc kubenswrapper[4834]: I0130 22:21:51.680770 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mb5nq"] Jan 30 22:21:53 crc kubenswrapper[4834]: I0130 22:21:53.589824 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mb5nq" podUID="389c204c-151a-4b0c-a99e-6f54de97086c" containerName="registry-server" containerID="cri-o://ba625d3c0b6b72b5f133858d0634e5d6c8f4591bf9d8ac9cc6c4816dd0b3037a" gracePeriod=2 Jan 30 22:21:55 crc kubenswrapper[4834]: I0130 22:21:55.613909 4834 generic.go:334] "Generic (PLEG): container finished" podID="389c204c-151a-4b0c-a99e-6f54de97086c" containerID="ba625d3c0b6b72b5f133858d0634e5d6c8f4591bf9d8ac9cc6c4816dd0b3037a" exitCode=0 Jan 30 22:21:55 crc kubenswrapper[4834]: I0130 22:21:55.613942 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mb5nq" event={"ID":"389c204c-151a-4b0c-a99e-6f54de97086c","Type":"ContainerDied","Data":"ba625d3c0b6b72b5f133858d0634e5d6c8f4591bf9d8ac9cc6c4816dd0b3037a"} Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.034412 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.184819 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-catalog-content\") pod \"389c204c-151a-4b0c-a99e-6f54de97086c\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.184973 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xwjv\" (UniqueName: \"kubernetes.io/projected/389c204c-151a-4b0c-a99e-6f54de97086c-kube-api-access-9xwjv\") pod \"389c204c-151a-4b0c-a99e-6f54de97086c\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.185026 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-utilities\") pod \"389c204c-151a-4b0c-a99e-6f54de97086c\" (UID: \"389c204c-151a-4b0c-a99e-6f54de97086c\") " Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.186067 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-utilities" (OuterVolumeSpecName: "utilities") pod "389c204c-151a-4b0c-a99e-6f54de97086c" (UID: "389c204c-151a-4b0c-a99e-6f54de97086c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.192045 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389c204c-151a-4b0c-a99e-6f54de97086c-kube-api-access-9xwjv" (OuterVolumeSpecName: "kube-api-access-9xwjv") pod "389c204c-151a-4b0c-a99e-6f54de97086c" (UID: "389c204c-151a-4b0c-a99e-6f54de97086c"). InnerVolumeSpecName "kube-api-access-9xwjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.242270 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "389c204c-151a-4b0c-a99e-6f54de97086c" (UID: "389c204c-151a-4b0c-a99e-6f54de97086c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.287274 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.287314 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xwjv\" (UniqueName: \"kubernetes.io/projected/389c204c-151a-4b0c-a99e-6f54de97086c-kube-api-access-9xwjv\") on node \"crc\" DevicePath \"\"" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.287324 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/389c204c-151a-4b0c-a99e-6f54de97086c-utilities\") on node \"crc\" DevicePath \"\"" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.624801 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mb5nq" event={"ID":"389c204c-151a-4b0c-a99e-6f54de97086c","Type":"ContainerDied","Data":"8d7689279611dff0d036a7484ac8076a264ec66158b021d8d4453c751210b332"} Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.624855 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mb5nq" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.625134 4834 scope.go:117] "RemoveContainer" containerID="ba625d3c0b6b72b5f133858d0634e5d6c8f4591bf9d8ac9cc6c4816dd0b3037a" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.645760 4834 scope.go:117] "RemoveContainer" containerID="01d2fb613339a797511527e19a832e0a4d88bf149e4e1bf6cf10cd75ce73615f" Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.663635 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mb5nq"] Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.676445 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mb5nq"] Jan 30 22:21:56 crc kubenswrapper[4834]: I0130 22:21:56.678814 4834 scope.go:117] "RemoveContainer" containerID="cb51744d3f148635263ee7dfe9a937fd0e06c3fb315c9f01479f00820682f16e" Jan 30 22:21:57 crc kubenswrapper[4834]: I0130 22:21:57.549731 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389c204c-151a-4b0c-a99e-6f54de97086c" path="/var/lib/kubelet/pods/389c204c-151a-4b0c-a99e-6f54de97086c/volumes"